Minority Report is not fiction anymore. Whether for a better user experience or for pandemic precautionary measures, gesture-based control can bring benefits. For demonstration purposes, we created four classes to distinguish basic gestures, but the model can be trained with any gestures, offering a wide range of new features to the end user. NanoEdge AI Studio supports the Time-of-Flight sensor, but this application can also be addressed with other sensors, such as radar.
Approach
We use a Time-of-Flight sensor rather than a camera. This reduces the number of signals to process and keeps only the necessary information.
We set the detection distance to 20 cm to reduce the influence of the background.
The sampling frequency of the sensor is set to its maximum (15 Hz) to capture gestures performed at a normal velocity.
We created a dataset with 1200 records per class, avoiding empty measurements (no motion).
Data logging is easy to manage with the evaluation board connected to the PC running NanoEdge AI Studio (see the logging sketch after the Data section).
Finally, we created an 'N-class classification' model (4 classes) in NanoEdge AI Studio and tested it live on a NUCLEO-F401RE with an X-NUCLEO-53L5A1 add-on board, as sketched below.
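For the live test, the C library generated by NanoEdge AI Studio is linked into a minimal firmware loop. The sketch below assumes the typical layout of a generated classification library (NanoEdgeAI.h, knowledge.h, CLASS_NUMBER, id2class, neai_classification_init, neai_classification); exact names and signatures depend on the Studio version. tof_read_frame_blocking is a hypothetical wrapper around the X-NUCLEO-53L5A1 driver, not a real driver function.

```c
#include <stdint.h>
#include <stdio.h>

#include "NanoEdgeAI.h"  /* header generated by NanoEdge AI Studio (assumed name) */
#include "knowledge.h"   /* trained model knowledge generated by the Studio */

#define FRAME_SIZE        64                          /* one 8x8 distance matrix */
#define FRAMES_PER_SIGNAL 4                           /* 4 successive matrices */
#define SIGNAL_SIZE (FRAMES_PER_SIGNAL * FRAME_SIZE)  /* 256 values */

/* Hypothetical wrapper around the X-NUCLEO-53L5A1 (VL53L5CX) driver:
 * blocks until a new 8x8 ranging frame is ready (15 Hz here), then
 * copies the 64 zone distances (in mm) into dist. */
extern void tof_read_frame_blocking(float dist[FRAME_SIZE]);

/* Class-name table from the generated example (assumed layout). */
extern const char *id2class[CLASS_NUMBER + 1];

int main(void)
{
    static float signal[SIGNAL_SIZE];
    float class_probas[CLASS_NUMBER];  /* per-class confidence, CLASS_NUMBER = 4 here */
    uint16_t class_id = 0;

    neai_classification_init(knowledge);  /* load the trained model */

    for (;;) {
        /* Concatenate 4 successive 8x8 frames into one 256-value signal. */
        for (int f = 0; f < FRAMES_PER_SIGNAL; f++) {
            tof_read_frame_blocking(&signal[f * FRAME_SIZE]);
        }
        neai_classification(signal, class_probas, &class_id);
        /* printf is assumed retargeted to the ST-LINK virtual COM port. */
        printf("gesture: %s\r\n", id2class[class_id]);  /* up / down / left / right */
    }
}
```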
Sensor
VL53L5CX Time-of-Flight multizone ranging sensor, mounted on the X-NUCLEO-53L5A1 expansion board
Data
Classes: 4 (up, down, left, and right movements)
Signal length: 256 values (4 successive 8x8 matrices)
Data rate: 15 Hz
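Each record is therefore a flat buffer of 256 distance values, stored frame by frame and row by row. For logging, NanoEdge AI Studio imports one signal per line with values separated by a single delimiter; a minimal formatter is sketched below. The (frame, row, col) indexing and the comma delimiter are assumptions about how the buffer is laid out here, not requirements of the Studio.

```c
#include <stdio.h>

#define ROWS   8
#define COLS   8
#define FRAMES 4
#define SIGNAL_SIZE (FRAMES * ROWS * COLS)  /* 256 values per record */

/* Position of one zone distance inside the flat 256-value record:
 * frames are stored back to back, each frame row by row. */
static inline int signal_index(int frame, int row, int col)
{
    return frame * ROWS * COLS + row * COLS + col;
}

/* Print one record as a single line of comma-separated values,
 * the one-signal-per-line layout NanoEdge AI Studio can import.
 * On a NUCLEO board, printf is typically retargeted to the
 * ST-LINK virtual COM port. */
static void log_record(const float signal[SIGNAL_SIZE])
{
    for (int i = 0; i < SIGNAL_SIZE; i++) {
        printf("%.0f%c", signal[i], (i < SIGNAL_SIZE - 1) ? ',' : '\n');
    }
}
```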
Results
4-class classification:
98.12% accuracy, 1.3 KB RAM, 59.1 KB flash

(Figure: green points represent correctly classified signals, red points misclassified signals; the classes are on the abscissa and the prediction confidence is on the ordinate.)
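At run time, the ordinate of this plot corresponds to the per-class probabilities returned in the classification output buffer, so low-confidence predictions can be rejected before acting on a gesture. A possible filter is sketched below; the 0.8 threshold is an arbitrary illustration to be tuned on real data, not a value from the Studio.

```c
/* Reject predictions whose best class probability is below a threshold.
 * class_probas is the output buffer filled by the classification call;
 * returns the winning class index, or -1 if no gesture is confident enough. */
static int confident_class(const float class_probas[], int n_classes)
{
    int best = 0;
    for (int i = 1; i < n_classes; i++) {
        if (class_probas[i] > class_probas[best]) {
            best = i;
        }
    }
    return (class_probas[best] >= 0.8f) ? best : -1;  /* 0.8f: example threshold */
}
```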