Core Concepts
SpectralMamba is a novel state space model-integrated deep learning framework that efficiently and effectively processes hyperspectral data for accurate image classification.
Summary
The key highlights and insights from the content are:
The authors propose SpectralMamba, a novel state space model-integrated deep learning framework for hyperspectral image classification. SpectralMamba features efficient modeling of hyperspectral data dynamics at two levels:
a. In the spatial-spectral space, a dynamical mask is learned by efficient convolutions to simultaneously encode spatial regularity and spectral peculiarity, attenuating spectral variability and confusion.
b. In the hidden state space, the merged spectrum is processed efficiently with input-dependent parameters, yielding selectively focused responses without relying on redundant attention or non-parallelizable recurrence.
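The input-dependent state space update described above can be sketched as a simple selective scan: the input and output projections vary with each spectral input, so the hidden state responds selectively rather than uniformly. This is a minimal illustrative sketch in the spirit of Mamba-style selective SSMs; the function name, shapes, and the `tanh` gating are assumptions, not SpectralMamba's actual implementation.

```python
import numpy as np

def selective_scan(x, A, w_B, w_C):
    """Toy selective state space recurrence (illustrative, not the paper's code).

    x:   (L,) 1-D spectral sequence
    A:   (n,) fixed per-state decay in (0, 1)
    w_B, w_C: (n,) weights; B_t and C_t depend on the current input x_t,
    which is what makes the scan "selective".
    """
    n = A.shape[0]
    h = np.zeros(n)          # hidden state
    y = np.empty_like(x)
    for t, x_t in enumerate(x):
        B_t = w_B * np.tanh(x_t)   # input-dependent input projection (assumed form)
        C_t = w_C * np.tanh(x_t)   # input-dependent readout (assumed form)
        h = A * h + B_t * x_t      # hidden-state update: h_t = A h_{t-1} + B_t x_t
        y[t] = C_t @ h             # selective output: y_t = C_t . h_t
    return y
```

Because `B_t` and `C_t` are recomputed from each input, near-zero or uninformative bands contribute little to the state, which is the intuition behind "selectively focused responses" without attention.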
To further improve efficiency, the authors introduce a piece-wise sequential scanning mechanism that transforms the continuous hyperspectral spectrum into sequences of squeezed length while preserving short- and long-term contextual profiles.
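One plausible reading of the piece-wise scanning step is a simple regrouping: the length-B spectrum is split into contiguous pieces, each piece becomes one token, so the scanned sequence is n_pieces long instead of B. Within-piece bands supply short-term context and the scan across pieces supplies long-term context. The sketch below is an assumption about the mechanism, not the paper's implementation, and the function name is invented for illustration.

```python
import numpy as np

def piecewise_scan(spectrum, n_pieces):
    """Regroup a length-B spectrum into n_pieces tokens of contiguous bands.

    spectrum: (B,) reflectance vector; B must be divisible by n_pieces.
    Returns an (n_pieces, B // n_pieces) array: sequence length is squeezed
    from B to n_pieces, with each token carrying a contiguous spectral piece.
    """
    B = spectrum.shape[0]
    assert B % n_pieces == 0, "band count must split evenly into pieces"
    piece_len = B // n_pieces
    # Row i holds bands [i * piece_len, (i + 1) * piece_len): short-term
    # context lives inside a row, long-term context across rows.
    return spectrum.reshape(n_pieces, piece_len)
```

For a 200-band spectrum with 10 pieces, the sequence an SSM must scan shrinks from 200 steps to 10, which is consistent with the reported parameter and computation savings.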
Extensive experiments on four benchmark hyperspectral datasets demonstrate that SpectralMamba significantly outperforms classic network architectures like MLP, CNN, RNN, and Transformer in both classification performance and computational efficiency.
The ablation studies verify the effectiveness of the key components; in particular, the piece-wise sequential scanning strategy yields up to around 4% improvement in overall accuracy while reducing parameters by 60% and computations by 40% relative to the baseline.
The authors claim that SpectralMamba is the first work that well tailors the deep state space model for hyperspectral data analysis, providing a novel and efficient solution to address the challenges of high dimensionality, spectral variability, and spectral confusion in hyperspectral image classification.
Statistics
The authors provide several key statistics and figures to support their claims:
SpectralMamba significantly outperforms classic network architectures like MLP, CNN, RNN, and Transformer in both classification performance (overall accuracy, OA) and computational efficiency (parameters and multiply-accumulate operations, MACs) on four benchmark hyperspectral datasets.
The piece-wise sequential scanning strategy in SpectralMamba brings up to around 4% improvement in overall accuracy while reducing parameters by 60% and computations by 40% compared to the baseline.