FB-ACMCSP: A Filter-Bank Adaptive Multi-Class Common Spatial Pattern Framework for Cross-Subject Motor Imagery EEG Classification
Abstract
Inter-subject variability remains the principal obstacle to zero-calibration brain-computer interface (BCI) deployment, where motor imagery (MI) EEG classifiers must generalize to entirely unseen users without individual calibration. Existing Common Spatial Pattern (CSP) variants either ignore population-level statistics or fail to capture spectral diversity across the mu and beta frequency bands. This paper proposes FB-ACMCSP, a Filter-Bank Adaptive Multi-Class Common Spatial Pattern framework that combines (1) filter-bank decomposition into nine overlapping sub-bands spanning 8-30 Hz to capture subject-specific spectral patterns, and (2) per-band adaptive covariance fusion that balances subject-specific and population-level covariance statistics via a fixed fusion parameter (alpha = 0.5). With Euclidean Alignment (EA) applied as a preprocessing step, FB-ACMCSP is evaluated under a strict Leave-One-Subject-Out (LOSO) cross-subject protocol on BCI Competition IV Dataset 2a (nine subjects, four MI classes). It achieves 41.28% mean LOSO accuracy and 69.64% within-subject 5-fold cross-validation accuracy, outperforming CSP (38.46%), ACMCSP (38.93%), RCSP (38.89%), and Riemannian MDM (40.70%), with consistent directional improvements in 6-7 of 9 subjects per comparison. The complete pipeline runs in under 50 ms per trial on standard CPU hardware without GPU requirements, confirming its suitability for real-time deployment. All source code is publicly available at https://github.com/fouadchouag/FB-ACMCSP.
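The two statistical operations named in the abstract, Euclidean Alignment and alpha-weighted covariance fusion, can be sketched in a few lines of NumPy. This is a minimal illustration under standard definitions from the EA and regularized-CSP literature, not the authors' implementation (see the linked repository for that); the function names and the trace-normalized covariance estimate are choices made here for clarity.

```python
import numpy as np

def euclidean_alignment(trials):
    """Align trials so their mean spatial covariance becomes the identity.

    trials: array of shape (n_trials, n_channels, n_samples).
    Standard EA: whiten every trial with R^{-1/2}, where R is the
    arithmetic mean of the per-trial covariance matrices.
    """
    covs = np.array([X @ X.T / X.shape[1] for X in trials])
    R = covs.mean(axis=0)
    # Inverse matrix square root via eigendecomposition (R is SPD).
    vals, vecs = np.linalg.eigh(R)
    R_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    return np.array([R_inv_sqrt @ X for X in trials])

def fused_covariance(subject_trials, population_cov, alpha=0.5):
    """Blend subject-specific and population covariance statistics.

    With the paper's fixed alpha = 0.5, subject and population
    contributions are weighted equally. Applied per sub-band in the
    filter-bank setting. Trace normalization is an illustrative choice.
    """
    C_subj = np.mean(
        [X @ X.T / np.trace(X @ X.T) for X in subject_trials], axis=0
    )
    return alpha * C_subj + (1.0 - alpha) * population_cov
```

After alignment, the mean covariance of the whitened trials equals the identity matrix, which is the property EA relies on to reduce inter-subject distribution shift; the fused matrix then feeds the per-band CSP eigendecomposition in place of the raw subject covariance.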