Hyperspectral image classification based on deep learning has recently attracted considerable attention, and many convolutional neural network classifiers have emerged with superior classification performance. However, most methods extract features with fixed convolution kernels and layer-wise representations, which limits the diversity of the extracted features. Moreover, feature fusion is often rough and simplistic: many methods fuse features at different levels by stacking modules hierarchically, ignoring the combination of shallow and deep spectral-spatial features. To overcome these issues, a novel multiscale dual-branch feature fusion and attention network is proposed. Specifically, we design a multiscale feature extraction (MSFE) module that extracts spatial-spectral features at a granular level and enlarges the receptive field, thereby strengthening multiscale feature extraction. We then develop a dual-branch feature fusion interactive module that combines the feature-reuse property of residual connections with the feature-exploration capability of dense connections, yielding more discriminative features in both the spatial and spectral branches. In addition, we introduce a shuffle attention mechanism that adaptively weights spatial and spectral features, further improving classification performance. Experimental results on three benchmark datasets demonstrate that our model outperforms other state-of-the-art methods at a lower computational cost.
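To make the shuffle-attention idea mentioned in the abstract concrete, the sketch below shows a heavily simplified NumPy version: channels are split into groups, each group is gated by a channel-attention weight derived from global average pooling, and a channel shuffle then mixes information across groups. This is a hypothetical illustration of the general mechanism, not the paper's exact module (which operates on both spatial and spectral attention branches inside a trained network).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_shuffle(x, groups):
    # x: (C, H, W). Interleave channels across groups so information
    # can flow between them (ShuffleNet-style channel shuffle).
    c, h, w = x.shape
    return (x.reshape(groups, c // groups, h, w)
             .transpose(1, 0, 2, 3)
             .reshape(c, h, w))

def shuffle_attention_sketch(x, groups=4):
    # Simplified sketch (hypothetical): per-group channel attention via
    # global average pooling and a sigmoid gate, followed by a channel
    # shuffle. Real shuffle attention also has a spatial branch and
    # learned parameters.
    c, h, w = x.shape
    gc = c // groups
    out = np.empty_like(x)
    for g in range(groups):
        sub = x[g * gc:(g + 1) * gc]
        # channel descriptor: global average pooling -> gate in (0, 1)
        gate = sigmoid(sub.mean(axis=(1, 2)))[:, None, None]
        out[g * gc:(g + 1) * gc] = sub * gate
    return channel_shuffle(out, groups)
```

A feature map of shape `(C, H, W)` passed through `shuffle_attention_sketch` keeps its shape; only the per-channel magnitudes are reweighted and the channel order permuted.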
Gao, H., Zhang, Y., Chen, Z., & Li, C. (2021). A Multiscale Dual-Branch Feature Fusion and Attention Network for Hyperspectral Images Classification. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 14, 8180–8192. https://doi.org/10.1109/JSTARS.2021.3103176