Deep Learning-Based Approach for Emotion Classification Using Stretchable Sensor Data

Abstract

Facial expressions play a vital role in human communication, especially for individuals with motor impairments who rely on alternative interaction methods. This study presents a deep learning-based approach for real-time emotion classification using stretchable strain sensors integrated into a wearable system. The sensors, fabricated with conductive silver ink on a flexible Tegaderm substrate, detect subtle facial muscle movements. Positioned strategically on the forehead, upper lip, lower lip, and left cheek, these sensors effectively capture emotions such as happiness, neutrality, sadness, and disgust. A data pipeline incorporating Min-Max normalization and SMOTE balancing addresses noise and class imbalances, while dimensionality reduction techniques like PCA and t-SNE enhance data visualization. The system’s classification performance was evaluated using standard machine learning metrics, achieving an overall accuracy of 76.6%, with notable success in distinguishing disgust (86.0% accuracy) and neutrality (81.0% accuracy). This work offers a flexible, cost-effective, and biocompatible solution for emotion recognition, with potential applications in rehabilitation robotics, assistive technologies, and human-computer interaction.
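The preprocessing pipeline described above (Min-Max normalization, SMOTE-style class balancing, and PCA for visualization) can be sketched as below. This is an illustrative reconstruction, not the authors' code: the sensor data is simulated, the oversampler is a naive SMOTE-like interpolation rather than the reference imbalanced-learn implementation, and the four channels standing in for the forehead, upper-lip, lower-lip, and left-cheek sensors are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for strain-sensor readings: 4 channels
# (forehead, upper lip, lower lip, left cheek), 4 emotion labels
# with deliberately imbalanced class counts.
X = rng.normal(size=(120, 4))
y = np.concatenate([np.zeros(60), np.ones(30),
                    np.full(20, 2), np.full(10, 3)]).astype(int)

def min_max_normalize(X):
    """Scale each sensor channel independently to the [0, 1] range."""
    mn, mx = X.min(axis=0), X.max(axis=0)
    return (X - mn) / (mx - mn)

def smote_like_oversample(X, y, rng):
    """Naive SMOTE-style balancing: synthesize minority-class samples
    by interpolating between a sample and a random same-class sample."""
    classes, counts = np.unique(y, return_counts=True)
    target = counts.max()
    X_out, y_out = [X], [y]
    for c, n in zip(classes, counts):
        Xc = X[y == c]
        for _ in range(target - n):
            i, j = rng.integers(len(Xc), size=2)
            lam = rng.random()
            X_out.append((Xc[i] + lam * (Xc[j] - Xc[i]))[None, :])
            y_out.append([c])
    return np.vstack(X_out), np.concatenate(y_out)

def pca_2d(X):
    """Project onto the top two principal components via SVD,
    e.g. for 2-D scatter-plot visualization of the classes."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:2].T

Xn = min_max_normalize(X)        # noise-robust per-channel scaling
Xb, yb = smote_like_oversample(Xn, y, rng)  # balanced classes
Z = pca_2d(Xb)                   # 2-D embedding for visualization
print(Xb.shape, np.bincount(yb), Z.shape)
```

After this stage, the balanced, normalized features would feed the deep learning classifier; t-SNE (e.g. `sklearn.manifold.TSNE`) could replace the PCA step when a nonlinear embedding is preferred for visualization.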