Learning to operate a sensorized prosthetic hand assessed with a portable system based on computer vision, eye tracking and prosthetic sensors
Abstract
As prosthetic technologies continue to develop, demand is growing for methods that objectively assess their impact on user operation. Existing evaluation tools rely primarily on subjective questionnaires and task performance metrics, which do not capture in real time how users handle prosthetic devices. Here we developed a portable system that combines prosthetic sensors, eye tracking and computer vision-based object segmentation to assess the visuomotor behavior involved in object manipulation with a prosthetic hand equipped with artificial somatic sensations. The system enables comprehensive analysis of visual attention and motor coordination across distinct manipulation phases, and its mobile design allows deployment in everyday environments. We suggest that this approach will improve the training of patients to operate sensorized prosthetic limbs by providing relevant metrics while users adapt to new prosthetic technologies. The results demonstrate subject-specific responses to sensory feedback and a shift in amputee behavior when invasive electrical stimulation is applied.
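To illustrate how gaze data and object segmentation can be combined for this kind of analysis, the following is a minimal sketch, not the authors' implementation: it assumes frame-synchronized gaze points, per-frame binary object masks, and phase labels (the names, data layout, and the `gaze_on_object_ratio` helper are all hypothetical), and estimates the fraction of gaze samples landing on the manipulated object in each manipulation phase.

```python
# Minimal sketch (assumed data layout, not the paper's pipeline): relate
# eye-tracking samples to computer-vision object masks to quantify visual
# attention on the manipulated object per manipulation phase.

import numpy as np

def gaze_on_object_ratio(gaze_xy, masks, phase_labels):
    """For each phase, return the fraction of gaze samples that fall
    inside the object's segmentation mask.

    gaze_xy      : (N, 2) array of gaze points in pixel coordinates (x, y),
                   one per video frame (assumed already synchronized).
    masks        : (N, H, W) boolean array, True where the object is segmented.
    phase_labels : length-N sequence of phase names, one per frame.
    """
    gaze_xy = np.asarray(gaze_xy)
    phase_labels = np.asarray(phase_labels)
    n, h, w = masks.shape

    # Clip gaze points to image bounds and look up the mask value under gaze.
    x = np.clip(gaze_xy[:, 0].astype(int), 0, w - 1)
    y = np.clip(gaze_xy[:, 1].astype(int), 0, h - 1)
    on_object = masks[np.arange(n), y, x]

    # Average per phase: proportion of frames in which gaze lies on the object.
    return {
        phase: float(on_object[phase_labels == phase].mean())
        for phase in np.unique(phase_labels)
    }

# Hypothetical usage with synthetic data (for illustration only).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, h, w = 200, 480, 640
    masks = np.zeros((n, h, w), dtype=bool)
    masks[:, 200:300, 250:400] = True                    # static object region
    gaze = rng.uniform([0, 0], [w, h], size=(n, 2))      # random gaze samples
    phases = np.repeat(["reach", "grasp", "release"], [80, 80, 40])
    print(gaze_on_object_ratio(gaze, masks, phases))
```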