Scaled Custom Attention for Enhanced Temporal Dependency Modeling in EEG Classification
Abstract
Accurate Electroencephalography (EEG) classification is essential for diagnosing brain disorders such as epilepsy. While deep learning models such as Convolutional Neural Networks (CNNs) and Long Short-Term Memory (LSTM) networks have improved EEG classification performance over traditional methods, existing attention mechanisms such as Additive, Luong, and Multi-head attention struggle to capture EEG's complex temporal dependencies. This study proposes Scaled Custom Attention (SCA), a mechanism for temporal dependency modeling during EEG classification. Unlike traditional Query-Key-Value (QKV) approaches, which rely on semantic weighting schemes, SCA employs a direct feature-weighting strategy that adapts to the unique temporal dependencies of EEG signals and introduces a scaling strategy that enhances stability. To validate our approach, experiments were conducted on the TUH EEG Epilepsy Corpus (TUEP), where SCA achieved compelling classification accuracy (98.17%), surpassing the Additive (96.47%), Multi-head (97.65%), and Luong (97.26%) attention mechanisms when integrated into the LConvNet EEG classification model. Additionally, SCA demonstrated strong scalability, parameter efficiency, and generalization ability, making it a promising enhancement for EEG-based deep learning models.
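To make the contrast with QKV attention concrete, the following is a minimal sketch of what a direct feature-weighting attention with scaling could look like. The specific formulation here (a single learned weight vector scoring each timestep, scaled by the square root of the feature dimension, followed by a softmax over time) is an illustrative assumption, not the authors' exact SCA definition; the parameter names `w` and `b` are hypothetical stand-ins for trained parameters.

```python
import numpy as np

def scaled_custom_attention(x, w, b=0.0):
    """Illustrative direct feature-weighting attention (assumed form).

    x: (T, D) array of T timesteps with D features each.
    w: (D,) weight vector; b: scalar bias. Both stand in for
    learned parameters and do not reflect the paper's exact setup.
    """
    d = x.shape[1]
    # Score each timestep directly from its features -- no separate
    # Query/Key/Value projections -- and scale by sqrt(D) for stability.
    scores = (x @ w + b) / np.sqrt(d)
    # Softmax over the time axis yields normalized attention weights.
    e = np.exp(scores - scores.max())
    alpha = e / e.sum()
    # The weighted sum pools the sequence into one context vector.
    context = alpha @ x
    return context, alpha

rng = np.random.default_rng(0)
x = rng.standard_normal((50, 8))   # e.g. 50 timesteps, 8 features
w = rng.standard_normal(8)
context, alpha = scaled_custom_attention(x, w)
```

Compared with QKV attention, this form scores timesteps without pairwise query-key interactions, which keeps the parameter count at O(D) rather than O(D^2) per projection, consistent with the parameter efficiency claimed above.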