Adaptive Spectral-Spatial Fusion Mamba: A Novel Framework for Enhanced Hyperspectral Image Classification
Abstract
Hyperspectral Image (HSI) classification is a critical task in remote sensing, yet it faces significant challenges including spectral redundancy, complex spatial-spectral dependencies, and the scarcity of labeled samples. While deep learning models, especially State-Space Models such as Mamba, show promise, current approaches often employ fixed spectral transformations and may not fully capture intricate spatial-spectral relationships. To address these limitations, we propose the Adaptive Spectral-Spatial Fusion Mamba (ASSF-Mamba) framework, designed for superior HSI classification through adaptive spectral processing and enhanced context-aware spatial-spectral fusion. ASSF-Mamba integrates three novel modules: an Adaptive Spectral Projection and Decorrelation (ASPD) module for learnable spectral dimension reduction; a Contextual Mamba Fusion (CMF) module extending Mamba with multi-scale spatial attention and cross-dimensional modulation for long-range dependency capture; and a Hierarchical Feature Enhancement (HFE) module employing multi-level residual connections and adaptive gating for robust feature representation. Comprehensive experiments on benchmark datasets including Indian Pines, Kennedy Space Center, and Houston demonstrate that ASSF-Mamba consistently achieves state-of-the-art classification accuracies, significantly outperforming a wide range of baselines, including advanced Mamba-based models. Furthermore, our framework exhibits superior robustness under limited training data conditions, maintains competitive computational efficiency, and yields visually more coherent classification maps. An ablation study confirms the critical contribution of each proposed module to the overall performance.
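As a minimal sketch (not the authors' implementation), the core idea behind the ASPD module's learnable spectral dimension reduction can be illustrated as a projection matrix applied along the band axis of an HSI cube; in the actual framework this projection would be a trainable layer optimized end-to-end together with a decorrelation objective. The shapes and the reduced dimension below are illustrative assumptions:

```python
import numpy as np

# Hypothetical shapes: an HSI patch of H x W pixels with B spectral bands
# (e.g. Indian Pines has 200 usable bands after removing water-absorption bands).
H, W, B = 9, 9, 200
D = 32  # reduced spectral dimension (illustrative choice)

rng = np.random.default_rng(0)
cube = rng.random((H, W, B)).astype(np.float32)

# A learnable spectral projection is, at its core, a B x D matrix applied
# per pixel; in a deep model it would be a trainable layer, not fixed like
# PCA-style transforms.
W_proj = (rng.standard_normal((B, D)) / np.sqrt(B)).astype(np.float32)

reduced = cube @ W_proj  # (H, W, D): each pixel's spectrum projected to D dims
print(reduced.shape)
```

The point of making the projection learnable, as opposed to a fixed transform, is that the reduced spectral representation can adapt to the classification task during training.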