Enabling Gesture on a Resource-Constrained Device: A TinyML Approach with Envelope EMG Data and Real-Time Testing
Abstract
Gestures are central to human-computer interaction, linking human intent to machine control. Electromyography (EMG), which records the electrical activity of muscles, has been studied for years as a basis for gesture recognition in the control of prostheses and rehabilitation equipment. Existing real-time classification systems tend to trade accuracy against latency and power consumption, and balancing these factors is difficult when designing such systems. This paper presents a TinyML-based method for classifying hand gestures from EMG signals that addresses these issues. Envelope EMG signals for six hand gestures were captured from 10 healthy individuals using an EMG Module; each subject performed two sequences of the six gestures, holding each motion for four seconds. After acquisition, the EMG data were segmented with a sliding window and features were extracted from each window. An artificial neural network classified the EMG data, with 60% of the samples used for training, 20% for validation, and 20% for testing. The model identified the six hand gestures with 98.35% accuracy. After conversion to TensorFlow Lite, the TinyML-based model was deployed on a Raspberry Pi Pico, demonstrating its practicality. In real-time testing, individual gestures were recognised with 95 to 99 per cent accuracy, and the model's response time was measured at 23 milliseconds. The proposed technique is applicable to human-machine interaction and smart-device control in prosthetics, rehabilitation, smart wheelchairs, intelligent entertainment, and biomedical engineering.
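The windowing and feature-extraction step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the sampling rate (500 Hz), window length (400 ms), 50% overlap, and the choice of mean-absolute-value and root-mean-square features are all assumptions for the sake of the example, since the abstract does not specify them.

```python
import numpy as np


def window_segments(signal, window_size, step):
    """Split a 1-D envelope EMG signal into overlapping windows."""
    starts = range(0, len(signal) - window_size + 1, step)
    return np.stack([signal[s:s + window_size] for s in starts])


def extract_features(windows):
    """Per-window features: mean absolute value (MAV) and root mean square (RMS).
    The actual feature set used in the paper may differ."""
    mav = np.mean(np.abs(windows), axis=1)
    rms = np.sqrt(np.mean(windows ** 2, axis=1))
    return np.column_stack([mav, rms])


# Synthetic stand-in for 4 s of envelope EMG at an assumed 500 Hz rate
fs = 500
signal = np.abs(np.sin(np.linspace(0, 8 * np.pi, 4 * fs)))

windows = window_segments(signal, window_size=200, step=100)  # 400 ms windows, 50% overlap
features = extract_features(windows)
print(features.shape)  # one (MAV, RMS) row per window
```

The resulting feature matrix would then be split 60/20/20 into training, validation, and test sets and fed to the artificial neural network classifier.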