Grasping Emotion: A Vision-Based Study of Hand Movement and Feeling
Abstract

Aims: To explore how emotional responses to various objects are reflected in hand kinematics during natural grasping.

Methods: Using a smartphone camera combined with MediaPipe and OpenCV, we recorded hand movements from 20 participants as they grasped five distinct objects designed to evoke specific emotions (e.g., fear, disgust, comfort) under either visual or non-visual conditions. We extracted kinematic features including velocity, grasping frequency, hand openness, movement efficiency, and stability, alongside self-reported emotional responses.

Results: Emotionally evocative objects produced significantly different grasping patterns, with the fear-inducing spider eliciting the highest movement velocity and frequency, and the disgust-evoking donut producing slower, minimal-contact behavior. High-arousal emotions were descriptively associated with greater movement velocity, but not with grasping frequency. Visual preview of objects did not significantly alter grasping behavior.

Conclusion: Emotional content shapes motor patterns during object interaction. Accessible vision-based tools can capture these signatures in naturalistic settings, highlighting opportunities for affect-aware technologies and research on embodied emotion.

Keywords: Emotion recognition; Hand kinematics; Grasping behavior; Affective motor responses; Object interaction; Computer vision; MediaPipe; Human–object interaction.
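To illustrate the kind of pipeline the Methods describe, the sketch below shows how smartphone video could be processed with MediaPipe Hands and OpenCV to derive two of the reported features, movement velocity and hand openness. The abstract does not specify the landmark indices, the openness proxy, or the velocity units, so those details (wrist landmark 0, fingertip landmarks 8/12/16/20, normalized image coordinates scaled by frame rate) are assumptions made here for illustration, not the authors' exact method.

```python
import cv2
import mediapipe as mp
import numpy as np

mp_hands = mp.solutions.hands


def hand_openness(landmarks):
    """Assumed proxy for hand aperture: mean distance from the wrist
    (landmark 0) to the four fingertip landmarks (8, 12, 16, 20)."""
    wrist = np.array([landmarks[0].x, landmarks[0].y])
    tips = np.array([[landmarks[i].x, landmarks[i].y] for i in (8, 12, 16, 20)])
    return float(np.mean(np.linalg.norm(tips - wrist, axis=1)))


def extract_features(video_path):
    """Track one hand through a video and return mean velocity and openness."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0  # fall back if metadata is missing
    wrist_xy, openness = [], []

    with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.5) as hands:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV decodes frames as BGR.
            result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.multi_hand_landmarks:
                lm = result.multi_hand_landmarks[0].landmark
                wrist_xy.append((lm[0].x, lm[0].y))
                openness.append(hand_openness(lm))
    cap.release()

    wrist = np.asarray(wrist_xy)
    # Frame-to-frame wrist displacement scaled by fps gives velocity in
    # normalized image units per second (an assumed, camera-relative scale).
    velocity = (np.linalg.norm(np.diff(wrist, axis=0), axis=1) * fps
                if len(wrist) > 1 else np.array([]))
    return {
        "mean_velocity": float(velocity.mean()) if velocity.size else 0.0,
        "mean_openness": float(np.mean(openness)) if openness else 0.0,
    }


if __name__ == "__main__":
    # Hypothetical file name; replace with a recorded grasping trial.
    print(extract_features("grasp_trial.mp4"))
```

Grasping frequency, movement efficiency, and stability would need additional definitions (e.g., peak counting on the aperture signal or path-length ratios) that the abstract does not detail, so they are omitted from this sketch.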