End-to-end multimodal deep learning for real-time decoding of months-long neural activity from the same cells


Abstract

Long-term, stable, real-time decoding of behavior-dependent neural dynamics from the same cells is critical for brain-computer interfaces (BCIs) and for understanding how neural activity evolves during learning, memory, and disease progression. Recent advances in flexible, high-density electrodes have enabled the stability required for long-term tracking, but they generate vast datasets that challenge existing analysis methods. Current spike sorting approaches rely heavily on manual curation and lack the scalability needed for large-scale, real-time processing. Here, we introduce AutoSort, an end-to-end multimodal deep neural network-based method that enables real-time tracking and decoding of the same neurons over months. AutoSort uses a scalable strategy: it learns deep representations from initial recordings and then applies the trained model in real time. It integrates multimodal features, including waveform features, distribution patterns, and inferred neuron spatial locations, to ensure robustness and accuracy. AutoSort outperforms existing methods on both simulated and long-term recordings while reducing computational demands, using only 10% of the time and 25% of the memory of conventional methods. By combining AutoSort with high-density flexible probes, we track neural dynamics in real time during motor learning and skill acquisition over two months, capturing intrinsic neural manifold drift, stabilization, and post-learning representational drift. AutoSort offers a promising solution for studying long-term intrinsic neural dynamics and enabling real-time BCI decoding.
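The two-phase strategy described above (fit on initial recordings, then assign spikes in real time) can be illustrated with a deliberately simplified sketch. A nearest-centroid matcher over a joint feature space stands in for AutoSort's deep network; the feature dimensions (waveform principal components and an inferred (x, y) location) and all numbers are hypothetical, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_unit(center_wf, center_xy, n):
    """Simulate n spikes from one unit: 3 hypothetical waveform PCs
    plus an inferred 2-D spatial location, concatenated per spike."""
    wf = center_wf + 0.1 * rng.standard_normal((n, 3))
    xy = center_xy + 2.0 * rng.standard_normal((n, 2))
    return np.hstack([wf, xy])

# "Initial recording" phase: learn a per-unit template (here, a centroid)
# in the joint multimodal feature space.
units = {
    0: make_unit(np.array([1.0, 0.0, 0.0]), np.array([10.0, 20.0]), 50),
    1: make_unit(np.array([0.0, 1.0, 0.0]), np.array([40.0, 5.0]), 50),
}
centroids = {u: f.mean(axis=0) for u, f in units.items()}

def assign(features):
    """'Real-time' phase: label each incoming spike with the nearest
    learned unit, keeping identities consistent across sessions."""
    labels = []
    for f in features:
        dists = {u: np.linalg.norm(f - c) for u, c in centroids.items()}
        labels.append(min(dists, key=dists.get))
    return labels

# A later session from the same cell as unit 0 is matched back to it.
later = make_unit(np.array([1.0, 0.0, 0.0]), np.array([10.0, 20.0]), 5)
print(assign(later))  # all spikes map to unit 0
```

In the actual method a trained deep network replaces the centroid distance, which is what allows robustness to waveform drift that a fixed template would miss.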