Dynamic Preclinical Detection and Progression Prediction of Neurodegenerative Diseases Using Multi-Modal Deep Learning
Abstract
Neurodegenerative disorders such as Alzheimer’s disease (AD) and Parkinson’s disease (PD) continue to pose significant global health challenges, particularly due to their insidious progression and the critical need for early, preclinical diagnosis. In this study, we present a novel multimodal deep learning framework that integrates heterogeneous data sources (omics, wearable sensor data, and environmental exposure metrics) for robust disease prediction and personalized progression forecasting. Our architecture employs a Transformer-based encoder for omics data, Long Short-Term Memory (LSTM) networks for temporal modeling of wearable signals, and Graph Neural Networks (GNNs) to capture spatial correlations in environmental exposures. A trainable attention-based fusion mechanism dynamically integrates modality-specific embeddings into a unified diagnostic representation. The framework was rigorously evaluated on benchmark datasets and achieved a classification accuracy of 98%, with sensitivity, specificity, and F1-score reaching 96%, 95%, and 97%, respectively. Furthermore, the model predicted therapeutic windows with 93% precision, highlighting its potential to support early clinical intervention. These results demonstrate the effectiveness of multimodal integration in enhancing diagnostic precision. Future extensions will explore federated learning to enable privacy-preserving and scalable deployment in real-world healthcare systems.
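The trainable attention-based fusion described above can be sketched in a few lines of PyTorch. This is an illustrative sketch only, not the authors' implementation: it assumes each modality encoder (Transformer for omics, LSTM for wearables, GNN for environment) has already projected its output into a shared embedding space, and the `AttentionFusion` module name and the 128-dimensional embedding size are hypothetical choices for demonstration.

```python
import torch
import torch.nn as nn

class AttentionFusion(nn.Module):
    """Hedged sketch of attention-based multimodal fusion.

    Each modality embedding is scored by a small MLP; the scores are
    softmax-normalized across modalities, and the fused representation
    is the attention-weighted sum of the embeddings.
    """

    def __init__(self, embed_dim: int):
        super().__init__()
        self.score = nn.Sequential(
            nn.Linear(embed_dim, embed_dim),
            nn.Tanh(),
            nn.Linear(embed_dim, 1),
        )

    def forward(self, embeddings: torch.Tensor) -> torch.Tensor:
        # embeddings: (batch, n_modalities, embed_dim)
        weights = torch.softmax(self.score(embeddings), dim=1)  # (batch, M, 1)
        return (weights * embeddings).sum(dim=1)                # (batch, embed_dim)

# Hypothetical setup: three modality embeddings (omics, wearables,
# environment) already projected into a shared 128-dim space.
batch, n_modalities, embed_dim = 4, 3, 128
fusion = AttentionFusion(embed_dim)
fused = fusion(torch.randn(batch, n_modalities, embed_dim))
print(tuple(fused.shape))  # (4, 128)
```

One advantage of this design over simple concatenation is that the softmax weights can be inspected per sample, giving a rough indication of which modality dominated a given prediction.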