Decoding intended speech with an intracortical brain-computer interface in a person with longstanding anarthria and locked-in syndrome

Abstract

Intracortical brain-computer interfaces (iBCIs) for decoding intended speech have provided individuals with ALS and severe dysarthria with an intuitive method for high-throughput communication. These advances, however, have been demonstrated in individuals who are still able to vocalize and move their speech articulators. Here, we decoded intended speech from an individual with longstanding anarthria, locked-in syndrome, and ventilator dependence due to advanced symptoms of ALS. We found that phonemes, words, and higher-order language units could be decoded well above chance. While sentence decoding accuracy was below that of demonstrations in participants with dysarthria, we were able to attain an extensive characterization of the neural signals underlying speech in a person with locked-in syndrome, and our results identify several directions for future improvement. These include closed-loop speech imagery training and decoding linguistic (rather than phonemic) units from neural signals in middle precentral gyrus. Overall, these results demonstrate that speech decoding from motor cortex may be feasible in people with anarthria and ventilator dependence. For individuals with longstanding anarthria, a purely phoneme-based decoding approach may lack the accuracy necessary to support independent use as a primary means of communication; however, additional linguistic information embedded within neural signals may provide a route to augment the performance of speech decoders.
