multiVIB: A unified probabilistic contrastive learning framework for atlas-scale integration of single-cell multi-omics data


Abstract

Comprehensive brain cell atlases are essential for understanding neural function and enabling translational insights. As single-cell technologies proliferate across experimental platforms, species, and modalities, these atlases must scale accordingly, calling for data integration frameworks that align heterogeneous datasets without erasing biologically meaningful variation. Existing tools typically address narrow integration settings, forcing researchers to assemble ad hoc workflows that can introduce artifacts. Here, we introduce multiVIB, a unified probabilistic contrastive learning framework that handles diverse integration scenarios. We show that multiVIB achieves state-of-the-art performance while mitigating spurious alignments. Applied to atlas-scale datasets from the BRAIN Initiative, multiVIB demonstrates robust and scalable integration, including integration across diverse data modalities and reliable preservation of species-specific variation in cross-species integration. These capabilities position multiVIB as a scalable, biologically faithful foundation for constructing next-generation brain cell atlases from the growing landscape of single-cell data.
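The abstract does not specify multiVIB's objective, but "probabilistic contrastive learning" and the name's echo of the variational information bottleneck (VIB) suggest a Gaussian encoder trained with a contrastive loss plus a KL regularizer. The following is a minimal sketch of that general idea, not the preprint's actual method: all function names, the linear encoder, the InfoNCE loss, and the KL weight are assumptions for illustration.

```python
# Hypothetical sketch: probabilistic (VIB-style) embeddings with an InfoNCE
# contrastive loss over paired cells from two modalities. This is NOT the
# multiVIB implementation; every design choice here is an assumption.
import numpy as np

rng = np.random.default_rng(0)

def encode(x, W_mu, W_logvar):
    """Linear Gaussian encoder: returns the mean and log-variance of q(z|x)."""
    return x @ W_mu, x @ W_logvar

def reparameterize(mu, logvar, rng):
    """Sample z = mu + sigma * eps (the reparameterization trick)."""
    return mu + np.exp(0.5 * logvar) * rng.standard_normal(mu.shape)

def kl_to_standard_normal(mu, logvar):
    """KL(q(z|x) || N(0, I)), averaged over the batch."""
    return 0.5 * np.mean(np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar, axis=1))

def info_nce(z_a, z_b, temperature=0.1):
    """Contrastive loss: matched rows of z_a and z_b are positive pairs."""
    z_a = z_a / np.linalg.norm(z_a, axis=1, keepdims=True)
    z_b = z_b / np.linalg.norm(z_b, axis=1, keepdims=True)
    logits = z_a @ z_b.T / temperature           # (n, n) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))          # positives sit on the diagonal

# Toy paired data: two "modalities" that noisily measure the same cells.
n, d_in, d_z = 32, 10, 4
cells = rng.standard_normal((n, d_in))
x_rna = cells + 0.1 * rng.standard_normal((n, d_in))
x_atac = cells + 0.1 * rng.standard_normal((n, d_in))

W_mu = rng.standard_normal((d_in, d_z))
W_logvar = 0.01 * rng.standard_normal((d_in, d_z))
mu_a, lv_a = encode(x_rna, W_mu, W_logvar)
mu_b, lv_b = encode(x_atac, W_mu, W_logvar)
z_a = reparameterize(mu_a, lv_a, rng)
z_b = reparameterize(mu_b, lv_b, rng)

# Total loss: align paired cells across modalities, regularize the embedding.
loss = info_nce(z_a, z_b) + 1e-3 * (
    kl_to_standard_normal(mu_a, lv_a) + kl_to_standard_normal(mu_b, lv_b)
)
print(float(loss))
```

The intuition matches the abstract's claims: the contrastive term pulls measurements of the same cell together across modalities (integration), while the stochastic embedding and KL term discourage the encoder from memorizing dataset-specific noise, which is one plausible route to mitigating spurious alignments.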