Sparse neural networks enable low-power, implantable neural interfaces

Abstract

Recent advances in brain-machine interfaces (BMIs) using neural network decoders and increased channel count have improved the restoration of speech and motor function, but at the cost of higher power consumption. For wireless, implantable BMIs to be clinically viable, power consumption must be limited to prevent thermal tissue damage and enable long use without frequent charging. Here, we show how neural network “pruning” creates sparse decoders that require fewer computations and active channels for reduced power consumption. Across multiple movement decoding tasks using brain and muscle signals, recurrent neural network decoders can be compressed by over 100x while maintaining strong performance, enabling decoding on the implant with <1% power increase compared to decoding externally. Pruning also allows for deactivating up to 89% of channels, reducing BMI power by up to 5x. Counterintuitively, our findings suggest that BMIs employing a subset of a larger number of channels may achieve lower power consumption than BMIs with fewer channels, for a given performance level. These results suggest a path toward power-efficient, implantable BMIs suitable for long-term clinical use.
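As a rough illustration of the pruning approach described above, the sketch below applies unstructured magnitude pruning to a recurrent decoder and then counts input channels whose weights have been entirely zeroed out (and could therefore be powered down). The framework (PyTorch), the GRU architecture, the channel count, and the pruning fraction are illustrative assumptions, not details taken from the article.

```python
# Minimal sketch: magnitude pruning of a recurrent decoder and channel deactivation.
# Assumptions (not from the article): PyTorch, a single-layer GRU decoder,
# 256 recording channels, ~99% of weights pruned.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

n_channels, hidden = 256, 512
decoder = nn.GRU(input_size=n_channels, hidden_size=hidden, batch_first=True)

# Unstructured L1 (magnitude) pruning of the input and recurrent weight matrices.
prune.l1_unstructured(decoder, name="weight_ih_l0", amount=0.99)
prune.l1_unstructured(decoder, name="weight_hh_l0", amount=0.99)

# Channels whose input-weight column is entirely zero no longer influence the
# decoder, so their amplifiers/ADCs could in principle be switched off.
w_in = decoder.weight_ih_l0            # shape: (3 * hidden, n_channels)
inactive = (w_in.abs().sum(dim=0) == 0)
print(f"deactivatable channels: {int(inactive.sum())} / {n_channels}")
```

In practice, pruning like this is typically interleaved with retraining (iterative prune-and-finetune) so that decoding performance is preserved at high sparsity levels.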
