Hands-Free Control of an Assistive Robotic Arm for High-Level Paralysis


Abstract

Background: Recent advancements in assistive robotic arms have enabled many tetraplegics to perform activities of daily living more independently. Because these systems typically require hand use, they are not a ready option for many high-level (C4 and above) tetraplegics. Such individuals, however, might be able to use signals arising from the head and neck to control assistive devices. The goal of this study was therefore to evaluate the utility of several head- and neck-derived signals for controlling a robotic arm during 3D center-out reaching to multiple targets ~25–50 cm from the start location.

Methods: Ten able-bodied human subjects were tested using five non-invasive, hands-free modalities (head position, head velocity, facial electromyography (EMG), tongue, and voice) to control the robotic arm. For comparison, subjects also controlled reaching movements of the robotic arm with joystick position and joystick velocity methods. A one-way repeated measures ANOVA was carried out on key performance indicators, including movement time, path efficiency, throughput, and perceived workload.

Results: The hands-free control modalities of head position, facial EMG, and tongue yielded average (± SD) movement times (5.8 ± 1.6, 8.2 ± 3.7, and 6.3 ± 2.0 s, respectively) that were not significantly different from that of the benchmark joystick position control (6.3 ± 2.3 s). Furthermore, no significant differences in perceived workload were revealed across control modalities.

Conclusions: These results indicate that various non-invasive, hands-free methods could be used effectively by high-level tetraplegics to operate assistive robotic arms.