HybridEEGNet: Combining Spatial Attention and Temporal Convolutions with Transformer Encoders for Automated Alzheimer's Disease Detection from EEG Signals


Abstract

Alzheimer’s disease (AD) is a progressive neurodegenerative disorder affecting over 55 million people worldwide, and early detection is crucial for effective intervention and care planning. Electroencephalography (EEG) offers a non-invasive, cost-effective, and widely accessible approach for AD screening, but traditional analysis methods often fail to capture the complex spatiotemporal patterns indicative of cognitive decline. In this paper, we present HybridEEGNet, a novel deep learning architecture that combines spatial attention mechanisms, multi-scale temporal convolutions with dilated kernels, and Transformer encoders for automated AD detection from resting-state EEG recordings. Our approach addresses the unique challenges of EEG analysis through a three-stage processing pipeline: (1) spatial attention to learn channel-specific importance reflecting regional brain activity patterns, (2) multi-scale dilated convolutions to extract local temporal features across different frequency resolutions, and (3) self-attention mechanisms to capture long-range temporal dependencies characteristic of neural synchronization patterns. We conduct comprehensive experiments on the PhysioNet EEG dataset, systematically comparing our approach against multiple baseline architectures including 1D-CNN, 2D-CNN, LSTM, and pure Transformer models using consistent preprocessing and evaluation protocols. HybridEEGNet achieves state-of-the-art performance with 72.9% accuracy, 81.1% F1-score, 91.1% recall, and 69.3% ROC-AUC, demonstrating significant improvements over all baselines. We provide extensive ablation studies quantifying the contribution of each architectural component, attention visualizations for model interpretability aligned with known AD biomarkers, cross-validation analysis for robustness assessment, and detailed computational efficiency comparisons. The high recall achieved by our model is particularly relevant for clinical screening applications, where minimizing false negatives is critical. We discuss the clinical implications, limitations, and future directions for EEG-based AD detection. Our code, trained models, and experimental configurations are publicly available at https://github.com/YCRG-Labs/alzheimers-eeg to facilitate reproducibility and accelerate future research in this important domain.
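To make the three-stage pipeline concrete, the PyTorch sketch below wires together a channel-wise spatial attention block, parallel dilated 1D convolutions, and a Transformer encoder in the order the abstract describes. The layer sizes, kernel widths, dilation rates, 19-channel montage, and mean-pooled classification head are illustrative assumptions, not the configuration from the released code at the repository above.

```python
# Minimal sketch of the three-stage pipeline: spatial attention ->
# multi-scale dilated convolutions -> Transformer encoder -> classifier.
# All hyperparameters below are assumptions for illustration only.
import torch
import torch.nn as nn


class SpatialAttention(nn.Module):
    """Learns per-channel importance weights over the EEG montage."""

    def __init__(self, n_channels: int):
        super().__init__()
        self.score = nn.Sequential(
            nn.Linear(n_channels, n_channels),
            nn.Tanh(),
            nn.Linear(n_channels, n_channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time)
        summary = x.mean(dim=-1)                          # (batch, channels)
        weights = torch.softmax(self.score(summary), dim=-1)
        return x * weights.unsqueeze(-1)                  # reweight channels


class MultiScaleDilatedConv(nn.Module):
    """Parallel dilated 1D convolutions covering different temporal scales."""

    def __init__(self, n_channels: int, n_filters: int, dilations=(1, 2, 4)):
        super().__init__()
        self.branches = nn.ModuleList(
            [nn.Conv1d(n_channels, n_filters, kernel_size=7,
                       dilation=d, padding=3 * d) for d in dilations]
        )
        self.norm = nn.BatchNorm1d(n_filters * len(dilations))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = torch.cat([branch(x) for branch in self.branches], dim=1)
        return torch.relu(self.norm(feats))


class HybridEEGNetSketch(nn.Module):
    """Spatial attention -> dilated convs -> Transformer encoder -> classifier."""

    def __init__(self, n_channels: int = 19, n_filters: int = 32,
                 d_model: int = 96, n_heads: int = 4, n_layers: int = 2,
                 n_classes: int = 2):
        super().__init__()
        self.spatial = SpatialAttention(n_channels)
        self.temporal = MultiScaleDilatedConv(n_channels, n_filters)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time)
        x = self.spatial(x)
        x = self.temporal(x)             # (batch, d_model, time)
        x = x.transpose(1, 2)            # (batch, time, d_model)
        x = self.encoder(x)              # long-range temporal dependencies
        return self.head(x.mean(dim=1))  # pooled classification logits


if __name__ == "__main__":
    # e.g. a batch of 4 recordings, 19 channels, 512 samples each
    logits = HybridEEGNetSketch()(torch.randn(4, 19, 512))
    print(logits.shape)  # torch.Size([4, 2])
```

In this sketch the three dilation branches are concatenated so that their combined filter count matches the Transformer model dimension; the actual architecture may fuse the convolutional features differently.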
