A Sensor-Driven Extended Reality System for Pre-Prosthetic Kinesthetic Learning in Upper-Limb Amputees

Abstract

The functional integration of an upper-limb prosthesis is critical for long-term user satisfaction, yet rates of device abandonment remain high. Primary contributing factors are high cognitive load and the difficulty of learning muscle control. To address these challenges, this article presents the development and preliminary evaluation of an Extended Reality (XR) training scenario. The prototype uses an adapted PPG sensor to measure residual-limb muscle activity and maps these signals to the control of a virtual prosthetic hand. The XR environment provides a controlled platform in which trainees practice gripping a variety of virtual objects. The approach delivers real-time biofeedback that enhances user control, aiming to establish more effective training that improves the adoption and functional outcomes of upper-limb prostheses.
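To illustrate the signal-to-control pathway described above, the following minimal sketch shows one plausible way a muscle-activity reading from an adapted PPG sensor could be normalized and smoothed into a grip-aperture value for a virtual hand. This is not the authors' implementation; the calibration levels, smoothing constant, and simulated readings are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the authors' implementation):
# mapping a raw muscle-activity reading to a 0.0 (open) .. 1.0 (closed)
# grip aperture for a virtual prosthetic hand, with light smoothing
# so that real-time biofeedback stays stable.

class GripMapper:
    """Converts raw sensor readings into a smoothed grip-aperture value."""

    def __init__(self, rest_level: float, max_level: float, smoothing: float = 0.2):
        self.rest_level = rest_level   # baseline reading with the muscle relaxed (assumed calibration)
        self.max_level = max_level     # reading at maximum voluntary contraction (assumed calibration)
        self.smoothing = smoothing     # exponential smoothing factor in (0, 1]
        self._grip = 0.0               # current smoothed grip aperture

    def update(self, raw_reading: float) -> float:
        # Normalize the reading into [0, 1] relative to the calibration range.
        span = max(self.max_level - self.rest_level, 1e-6)
        target = min(max((raw_reading - self.rest_level) / span, 0.0), 1.0)
        # Exponential smoothing reduces jitter in the virtual hand's motion.
        self._grip += self.smoothing * (target - self._grip)
        return self._grip


if __name__ == "__main__":
    mapper = GripMapper(rest_level=0.10, max_level=0.85)
    for reading in [0.12, 0.40, 0.70, 0.82, 0.30]:  # simulated sensor samples
        print(f"grip aperture: {mapper.update(reading):.2f}")
```

In a full XR system, the smoothed grip value would drive the virtual hand's animation each frame, and the rendered hand state itself serves as the visual biofeedback loop for the trainee.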
