Organelle-Aware Representation Learning Enables Label-Free Detection of Mitochondrial Dysfunction in Live Human Neurons

Abstract

Mitochondrial dysfunction is a convergent hallmark of neurodegenerative diseases and represents a promising biomarker for early diagnosis and a target for therapy. However, current in vitro assays rely on fluorescence or electron microscopy, which are invasive, low-throughput, and incompatible with longitudinal analysis. Here, we present a noninvasive framework that integrates label-free optical diffraction tomography (ODT) with organelle-aware representation learning to detect subtle mitochondrial dysfunction in live human induced pluripotent stem cell (hiPSC)-derived neurons. Through virtual staining of nuclei, lysosomes, and mitochondria, we establish two complementary classification pipelines: a deep learning model with an organelle-aware encoder and a logistic regression model on morphometric descriptors. Both models achieve approximately 85% accuracy: the deep model provides end-to-end prediction with minimal feature engineering, whereas the logistic regression model offers a more interpretable, feature-based approach. To our knowledge, this is the first demonstration of ODT-based organelle-resolved virtual staining in live human neurons, establishing a scalable, noninvasive platform for mitochondrial disease modeling, drug discovery, and neurodegeneration research.
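To make the feature-based pipeline concrete, below is a minimal, illustrative sketch of a logistic regression classifier trained on per-cell morphometric descriptors, assuming scikit-learn. The descriptor names and the synthetic data are hypothetical placeholders, not the authors' actual features or measurements; the point is only to show how such a model yields a per-descriptor, interpretable readout.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical morphometric descriptors extracted from virtually stained
# organelles (e.g., mitochondrial count, mean branch length, lysosome area).
n_cells, n_features = 200, 6
X = rng.normal(size=(n_cells, n_features))          # synthetic stand-in data
y = rng.integers(0, 2, size=n_cells)                # 0 = control, 1 = dysfunctional (toy labels)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

# Standardize features, then fit an L2-regularized logistic regression;
# the fitted coefficients indicate each descriptor's contribution.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
print("per-descriptor coefficients:", clf[-1].coef_.round(2))

On real descriptors, the coefficient magnitudes and signs are what make this pipeline the more interpretable of the two described above.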
