Annotation-free whole-brain neuron tracking via transformer-based self-supervised learning

Abstract

Tracking individual neurons in dense 3D microscopy recordings of fast-moving, deforming organisms is critically important in neuroscience, yet the task is often hindered by complex motion and the need for laborious manual annotation. Here, we introduce Annotation-free Self-supervised Contrastive Embeddings for Neuron Tracking (ASCENT), a computational framework that learns robust representations of neuronal identity that are invariant to tissue deformation. ASCENT trains a Neuron Embedding Transformer (NETr) through a self-supervised contrastive scheme on augmented data that mimics animal movement and tissue deformation. NETr generates highly discriminative embeddings for each neuron by combining visual features with positional information, refined by the context of all other neurons in the frame. On a benchmark dataset of freely moving C. elegans, ASCENT achieves the best tracking accuracy reported to date, surpassing even supervised methods. We further demonstrate ASCENT's robust performance on several additional datasets spanning different anatomical regions and imaging conditions. The method is generally robust to image noise, highly data-efficient, and generalizes across imaging conditions. By eliminating the annotation bottleneck, ASCENT provides a scalable and practical solution for the analysis of whole-brain neural dynamics in behaving animals.
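The abstract does not give implementation details, so the sketch below is only a rough illustration of the kind of training scheme it describes: a toy transformer encoder (a stand-in for NETr) fuses per-neuron visual features with 3D positions, and an InfoNCE-style contrastive loss treats each neuron's counterpart in a synthetically deformed view of the same frame as its positive pair. All names, dimensions, and the jitter-based augmentation are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NeuronEmbedder(nn.Module):
    """Toy stand-in for NETr: fuses per-neuron visual features with 3D
    positions, then lets a transformer encoder refine each neuron's
    embedding using the context of all other neurons in the frame."""
    def __init__(self, feat_dim=32, pos_dim=3, embed_dim=64):
        super().__init__()
        self.proj = nn.Linear(feat_dim + pos_dim, embed_dim)
        layer = nn.TransformerEncoderLayer(d_model=embed_dim, nhead=4,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, feats, pos):
        # feats: (B, N, feat_dim) visual features; pos: (B, N, 3) positions
        x = self.proj(torch.cat([feats, pos], dim=-1))
        return F.normalize(self.encoder(x), dim=-1)  # unit-norm embeddings

def info_nce(z_a, z_b, temperature=0.1):
    """InfoNCE over neurons: neuron i in view A should match neuron i in
    view B (same identity, deformed pose) and repel all other neurons."""
    logits = z_a @ z_b.transpose(-1, -2) / temperature   # (B, N, N)
    targets = torch.arange(logits.size(-1)).expand(logits.size(0), -1)
    return F.cross_entropy(logits.flatten(0, 1), targets.flatten())

def deform(pos, scale=0.05):
    """Hypothetical augmentation: random jitter as a crude proxy for
    tissue deformation (a real scheme would use smooth nonrigid warps)."""
    return pos + scale * torch.randn_like(pos)

# One self-supervised training step on synthetic data.
model = NeuronEmbedder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
feats = torch.randn(8, 100, 32)   # 8 frames, 100 neurons, 32-d features
pos = torch.randn(8, 100, 3)      # 3D neuron positions

z_a = model(feats, pos)
z_b = model(feats, deform(pos))   # second "view" of the same neurons
loss = info_nce(z_a, z_b)
opt.zero_grad()
loss.backward()
opt.step()
```

At inference time, a scheme like this would embed every neuron in every frame and link detections across frames by nearest-neighbor matching in embedding space, with no manual annotation required for training.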
