ATHENA: Automatically Tracking Hands Expertly with No Annotations

Abstract

Studying naturalistic hand behaviours is challenging due to the limitations of conventional marker-based motion capture, which can be costly, time-consuming, and encumbering for participants. While markerless pose estimation tools exist, an accurate, off-the-shelf solution validated for hand-object manipulation is still needed. We present ATHENA (Automatically Tracking Hands Expertly with No Annotations), an open-source, Python-based toolbox for 3D markerless hand tracking. To validate ATHENA, we concurrently recorded hand kinematics using ATHENA and an industry-standard optoelectronic marker-based system (OptiTrack). Participants performed unimanual, bimanual, and naturalistic object manipulation tasks, and we compared common kinematic variables including grip aperture, wrist velocity, index metacarpophalangeal flexion, and bimanual span. Our results demonstrated high spatiotemporal agreement between ATHENA and OptiTrack, evidenced by strong correspondence (R² > 0.90 across the majority of tasks) and low root mean square differences (< 1 cm for grip aperture, < 4 cm/s for wrist velocity, and < 5-10° for index metacarpophalangeal flexion). ATHENA reliably preserved trial-to-trial variability in kinematics, yielding the same scientific conclusions as marker-based approaches, but with significantly reduced financial and time costs and no participant encumbrance. In conclusion, ATHENA is an accurate, automated, and easy-to-use platform for 3D markerless hand tracking that enables more ecologically valid motor control and learning studies of naturalistic hand behaviours, enhancing our understanding of human dexterity.
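
The abstract does not specify the exact comparison pipeline, but the reported metrics (R² and root mean square differences between time-aligned traces) can be illustrated with a minimal Python sketch. The function name, the synthetic grip-aperture traces, and the sampling assumptions below are illustrative, not taken from the ATHENA codebase:

```python
import numpy as np

def agreement_metrics(markerless: np.ndarray, marker_based: np.ndarray):
    """Compare two time-aligned kinematic traces (e.g. grip aperture in cm).

    Returns the coefficient of determination (R^2) of the markerless trace
    against the marker-based reference, and the root mean square difference
    (RMSD) between the two. Both arrays are assumed to have equal length and
    share a common time base.
    """
    residual = markerless - marker_based
    rmsd = np.sqrt(np.mean(residual ** 2))
    ss_res = np.sum(residual ** 2)
    ss_tot = np.sum((marker_based - marker_based.mean()) ** 2)
    r_squared = 1.0 - ss_res / ss_tot
    return r_squared, rmsd

# Illustrative use with synthetic grip-aperture traces (cm), not real data:
t = np.linspace(0, 1, 200)
optitrack = 4 + 4 * np.sin(np.pi * t)                   # reference trace
athena = optitrack + np.random.normal(0, 0.3, t.size)   # markerless estimate
r2, rmsd = agreement_metrics(athena, optitrack)
print(f"R^2 = {r2:.3f}, RMSD = {rmsd:.2f} cm")
```

Under this kind of comparison, R² > 0.90 and RMSD below 1 cm for grip aperture correspond to the agreement thresholds reported in the abstract.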
