ADLi-Net: A Learnable Dilation Deep Learning Framework for Accurate Alzheimer’s Disease Stage Classification from Brain Magnetic Resonance Images

Abstract

Alzheimer’s Disease (AD) is a leading cause of dementia owing to its progressive neurodegenerative nature. However, conventional Deep Learning (DL) models that rely on fixed dilation rates struggle to capture both fine-grained and global structural changes, leading to suboptimal feature representations. In this research, a novel framework named Alzheimer’s Disease Learnable Dilation Network (ADLi-Net) is proposed, which integrates DenseNet-201 with a Learnable Atrous Spatial Pyramid Pooling (ASPP) module for multi-class AD classification from Magnetic Resonance Imaging (MRI). The Learnable ASPP adapts its dilation rates through backpropagation during training, enabling the model to extract context-aware features from complex brain structures. Preprocessing techniques, such as Contrast Limited Adaptive Histogram Equalization (CLAHE) and min-max normalization, are applied to enhance image contrast and standardize pixel intensity values, ensuring consistent input for subsequent processing. The proposed ADLi-Net is evaluated on the ADNI dataset containing over 23,000 T2-weighted MRI scans categorized into six cognitive states. Experimental results show that ADLi-Net achieves an overall accuracy of 98.19% on the ADNI dataset. A comparative analysis and ablation study confirm the effectiveness of learnable dilation rates, weighted loss functions, and the chosen validation strategy. This research therefore demonstrates that dynamically tuned multi-scale spatial filtering enhances AD classification.
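The preprocessing pipeline described above (CLAHE followed by min-max normalization) can be illustrated with a minimal sketch using OpenCV. The clip limit and tile grid size below are illustrative defaults, not values reported in the paper, and the helper name `preprocess_slice` is hypothetical.

```python
import cv2
import numpy as np

def preprocess_slice(img_gray: np.ndarray,
                     clip_limit: float = 2.0,
                     tile_grid_size: tuple = (8, 8)) -> np.ndarray:
    """CLAHE contrast enhancement followed by min-max normalization.

    clip_limit and tile_grid_size are assumed defaults, not reported values.
    """
    # CLAHE expects an 8-bit single-channel image, so rescale first.
    img_u8 = cv2.normalize(img_gray, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    clahe = cv2.createCLAHE(clipLimit=clip_limit, tileGridSize=tile_grid_size)
    enhanced = clahe.apply(img_u8).astype(np.float32)

    # Min-max normalization to [0, 1] so all scans share a common intensity scale.
    lo, hi = enhanced.min(), enhanced.max()
    return (enhanced - lo) / (hi - lo + 1e-8)
```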
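For the Learnable ASPP module, integer dilation rates are not directly differentiable, so the sketch below uses one common differentiable approximation: parallel atrous branches with candidate rates combined through learnable softmax weights that are tuned by backpropagation. This is an assumption for illustration; the authors' exact formulation of learnable dilation may differ. The class name `LearnableASPP` and the candidate rates are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LearnableASPP(nn.Module):
    """Sketch of an ASPP block whose effective dilation is tuned by backpropagation.

    Learns softmax weights over parallel atrous branches with candidate rates;
    this is a differentiable approximation, not necessarily the paper's method.
    """

    def __init__(self, in_ch: int, out_ch: int, rates=(1, 6, 12, 18)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=r, dilation=r, bias=False),
                nn.BatchNorm2d(out_ch),
                nn.ReLU(inplace=True),
            )
            for r in rates
        ])
        # Learnable logits: backpropagation adjusts how much each dilation rate contributes.
        self.rate_logits = nn.Parameter(torch.zeros(len(rates)))
        self.project = nn.Conv2d(out_ch, out_ch, kernel_size=1)

    def forward(self, x):
        weights = F.softmax(self.rate_logits, dim=0)
        out = sum(w * branch(x) for w, branch in zip(weights, self.branches))
        return self.project(out)
```

Such a module could, for example, be attached to the 1920-channel feature map produced by a torchvision `densenet201` backbone, followed by global pooling and a linear head over the six cognitive states; this wiring is an assumption for illustration.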
