Have you ever dreamed of controlling a machine using hand gestures?
What if your phone could send your friends emojis, based on the movement of your hands?
It has now become a reality! Thanks to ST multi-zone Time-of-Flight (ToF) sensors, this solution does not require a camera. The AI algorithms run on an STM32 microcontroller with low processing complexity and low power consumption.
Define your own set of hand postures, collect your dataset, train your AI model, and create your application!
Approach
This hand posture recognition solution detects a set of hand postures with an ST multi-zone Time-of-Flight sensor and runs on a NUCLEO-F401RE board.
The development process is based on the following steps:
- Define your own set of hand postures (dataset)
- Collect your dataset from several users, using the distance and signal data from an 8 x 8 multi-zone ToF sensor such as the VL53L5CX, VL53L7CX, or VL53L8CX (see the acquisition sketch after this list)
- Train the AI network with the training script from the STM32 model zoo
- Deploy the AI model on your selected STM32 MCU with the STM32Cube.AI Developer Cloud or the "Hand Posture Getting Started" package included in the STM32 model zoo (an inference sketch follows below)
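As an illustration of the data-collection step, here is a minimal acquisition sketch based on the ST VL53L5CX ULD driver (vl53l5cx_api.h). It assumes the sensor has already been initialized and started (vl53l5cx_init, vl53l5cx_set_resolution with VL53L5CX_RESOLUTION_8X8, vl53l5cx_start_ranging) with the default single target per zone; collect_frame is a hypothetical helper name, not part of the driver.

```c
#include <stdint.h>
#include "vl53l5cx_api.h"  /* ST VL53L5CX ULD driver */

/* Reads one 8 x 8 frame of distance and signal-rate data.
 * Returns 0 on success, -1 on a driver error. */
int collect_frame(VL53L5CX_Configuration *dev,
                  int16_t distance_mm[64], uint32_t signal_rate[64])
{
    uint8_t ready = 0;
    VL53L5CX_ResultsData results;

    /* Poll until a new ranging frame is available. */
    do {
        if (vl53l5cx_check_data_ready(dev, &ready) != 0)
            return -1;
    } while (!ready);

    if (vl53l5cx_get_ranging_data(dev, &results) != 0)
        return -1;

    /* Keep the two channels used by the model: distance and signal rate. */
    for (int zone = 0; zone < 64; zone++) {
        distance_mm[zone] = results.distance_mm[zone];
        signal_rate[zone] = results.signal_per_spad[zone];
    }
    return 0;
}
```

Each frame collected this way becomes one sample of your dataset once it is labeled with the posture being performed.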
This approach lets you develop the application quickly, with highly configurable hand postures, a small memory footprint, and low processing power.
Depending on the application, the ToF sensor can be positioned in front of the user (personal computer, satisfaction box), pointed at the ceiling (cooking plate), or fixed on a moving object (smart glasses).
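For the deployment step, the sketch below shows how the exported model could be called from the C runtime that STM32Cube.AI (X-CUBE-AI) generates for a network named "network". The function and macro names (ai_network_create_and_init, AI_NETWORK_DATA_ACTIVATIONS_SIZE, and so on) follow recent versions of the generated code and may differ in your project; posture_net_init and posture_net_run are hypothetical wrappers, not the shipped "Hand Posture Getting Started" code.

```c
#include "network.h"       /* generated by X-CUBE-AI */
#include "network_data.h"  /* generated weight and activation sizes */

static ai_handle net = AI_HANDLE_NULL;

/* Activation buffer in RAM (about 3 Kbytes for this model). */
AI_ALIGNED(32)
static ai_u8 activations[AI_NETWORK_DATA_ACTIVATIONS_SIZE];

/* One-time creation and initialization of the network instance. */
int posture_net_init(void)
{
    ai_error err = ai_network_create_and_init(
        &net, (const ai_handle[]){ activations }, NULL);
    return (err.type == AI_ERROR_NONE) ? 0 : -1;
}

/* Runs one inference on a packed 8 x 8 x 2 input tensor and writes
 * the per-posture scores to out_data. */
int posture_net_run(const float *in_data, float *out_data)
{
    ai_buffer *in  = ai_network_inputs_get(net, NULL);
    ai_buffer *out = ai_network_outputs_get(net, NULL);

    in[0].data  = AI_HANDLE_PTR(in_data);
    out[0].data = AI_HANDLE_PTR(out_data);

    /* ai_network_run() returns the number of batches processed. */
    return (ai_network_run(net, in, out) == 1) ? 0 : -1;
}
```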
Sensor data
- Dataset: private dataset with 5 users and 7 hand postures
- Data format: 8 x 8 ranging distance and signal rate
- Frequency: aligned with the application and the reactivity needed
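To give an idea of how a raw frame can feed the network, the sketch below packs the distance and signal-rate channels into an 8 x 8 x 2 float tensor. The channel order, the pack_input helper, and the normalization constants are illustrative assumptions; the actual preprocessing is defined by the STM32 model zoo training scripts and must match what the model was trained with.

```c
#include <stdint.h>

/* Assumed scaling ranges; align them with your training preprocessing. */
#define MAX_DISTANCE_MM 400.0f  /* assumed hand-interaction range */
#define MAX_SIGNAL_RATE 5000.0f /* assumed signal-rate ceiling */

/* Packs one frame into a zone-major 8 x 8 x 2 tensor:
 * channel 0 = normalized distance, channel 1 = normalized signal rate. */
void pack_input(const int16_t distance_mm[64],
                const uint32_t signal_rate[64], float input[64 * 2])
{
    for (int zone = 0; zone < 64; zone++) {
        input[zone * 2 + 0] = (float)distance_mm[zone] / MAX_DISTANCE_MM;
        input[zone * 2 + 1] = (float)signal_rate[zone] / MAX_SIGNAL_RATE;
    }
}
```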
Results
- Model: 2D CNN
- Memory footprint: 29 Kbytes of flash memory for weights, 3 Kbytes of RAM for activations
- Accuracy: 96.4 %
- Performance on STM32F401 @ 84 MHz: 1.5 ms inference time

(Figure: confusion matrix)