Non-invasive brain-machine interface control with artificial intelligence copilots

Abstract

Motor brain-machine interfaces (BMIs) decode neural signals to help people with paralysis move and communicate. Even with important advances in the last two decades, BMIs face key obstacles to clinical viability. Invasive BMIs achieve proficient cursor and robotic arm control but require neurosurgery, posing significant risk to patients. Non-invasive BMIs avoid neurosurgical risk but achieve lower performance, sometimes proving prohibitively frustrating to use, which prevents widespread adoption. We take a step toward breaking this performance-risk tradeoff by building performant non-invasive BMIs. The critical limitation that bounds decoder performance in non-invasive BMIs is their poor neural signal-to-noise ratio. To overcome this, we contribute (1) a novel EEG decoding approach and (2) artificial intelligence (AI) copilots that infer task goals and aid action completion. We demonstrate that with this “AI-BMI,” in tandem with a new adaptive decoding approach using a convolutional neural network (CNN) and ReFIT-like Kalman filter (KF), healthy users and a paralyzed participant can autonomously and proficiently control computer cursors and robotic arms. Using an AI copilot improves goal acquisition speed by up to 4.3× in the standard center-out 8 cursor control task and enables users to control a robotic arm to perform the sequential pick-and-place task, moving 4 randomly placed blocks to 4 randomly chosen locations. As AI copilots improve, this approach may result in clinically viable non-invasive AI-BMIs.
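The abstract does not specify how the copilot's inferred goal is combined with the decoded command. As a purely illustrative sketch (not the paper's implementation), a shared-autonomy AI-BMI might blend the velocity decoded from EEG (e.g., by the CNN + KF pipeline) with a velocity directed toward the copilot's inferred goal; all names and parameters below (blend_copilot, alpha, gain) are assumptions for illustration.

```python
import numpy as np

def blend_copilot(decoded_vel, cursor_pos, inferred_goal, alpha=0.5, gain=1.0):
    """Blend a BMI-decoded velocity with a copilot's goal-directed velocity.

    decoded_vel   : velocity decoded from neural signals (np.ndarray)
    cursor_pos    : current cursor or end-effector position (np.ndarray)
    inferred_goal : copilot's estimate of the intended target (np.ndarray)
    alpha         : assistance level in [0, 1]; 0 = pure user control
    gain          : speed of the copilot-suggested motion
    """
    to_goal = inferred_goal - cursor_pos
    dist = np.linalg.norm(to_goal)
    copilot_vel = gain * to_goal / dist if dist > 1e-9 else np.zeros_like(to_goal)
    # Convex combination of user-decoded and copilot-suggested velocities.
    return (1.0 - alpha) * decoded_vel + alpha * copilot_vel
```

In such shared-control schemes, the assistance level trades off user autonomy against task speed, which is one way an improving copilot could raise end-to-end performance without changing the underlying decoder.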