Remote sensing imagery super resolution based on adaptive multi‐scale feature fusion network

Abstract

Because the factors that degrade remote sensing imagery are increasingly complex, inferring its high-frequency details is more difficult than for ordinary digital photos. This paper proposes an adaptive multi-scale feature fusion network (AMFFN) for remote sensing image super-resolution. First, shallow features are extracted from the original low-resolution image. Then, several adaptive multi-scale feature extraction (AMFE) modules, together with squeeze-and-excitation and adaptive gating mechanisms, are adopted for feature extraction and fusion. Finally, sub-pixel convolution is used to reconstruct the high-resolution image. Experiments are performed on three datasets; key design choices, such as the number of AMFE modules and the gating connection scheme, are studied, and super-resolution of remote sensing imagery at different scale factors is analyzed qualitatively and quantitatively. The results show that the method outperforms classic approaches such as the Super-Resolution Convolutional Neural Network (SRCNN), the Efficient Sub-Pixel Convolutional Network (ESPCN), and the multi-scale residual CNN (MSRN).
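The pipeline described above (shallow feature extraction, a stack of AMFE modules with squeeze-and-excitation and adaptive gating, then sub-pixel convolution for reconstruction) can be sketched roughly as follows. This is a minimal PyTorch-style sketch based only on the abstract: the module names, channel counts, kernel sizes, and the exact gating/fusion details are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of an AMFFN-style network, assuming a PyTorch implementation.
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-excitation: reweight channels using global context."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.fc(self.pool(x))

class AMFE(nn.Module):
    """Adaptive multi-scale feature extraction (illustrative): parallel 3x3/5x5
    branches, SE channel attention, and a learned gate on the residual path."""
    def __init__(self, channels):
        super().__init__()
        self.branch3 = nn.Conv2d(channels, channels, 3, padding=1)
        self.branch5 = nn.Conv2d(channels, channels, 5, padding=2)
        self.fuse = nn.Conv2d(2 * channels, channels, 1)
        self.se = SEBlock(channels)
        self.gate = nn.Sequential(nn.Conv2d(channels, channels, 1), nn.Sigmoid())

    def forward(self, x):
        multi = torch.cat([self.branch3(x), self.branch5(x)], dim=1)
        feat = self.se(self.fuse(multi))
        return x + self.gate(feat) * feat  # adaptively gated residual fusion

class AMFFN(nn.Module):
    def __init__(self, scale=4, channels=64, n_blocks=4):
        super().__init__()
        self.head = nn.Conv2d(3, channels, 3, padding=1)      # features from the LR input
        self.body = nn.Sequential(*[AMFE(channels) for _ in range(n_blocks)])
        self.tail = nn.Sequential(                             # sub-pixel convolution reconstruction
            nn.Conv2d(channels, 3 * scale ** 2, 3, padding=1),
            nn.PixelShuffle(scale),
        )

    def forward(self, lr):
        return self.tail(self.body(self.head(lr)))

hr = AMFFN(scale=4)(torch.randn(1, 3, 48, 48))  # -> (1, 3, 192, 192)
```

The gated residual connection in `AMFE` is one plausible reading of the "adaptive gating" mentioned in the abstract; the paper's actual connection scheme (one of the characteristics it studies) may differ.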

Citation (APA)
Wang, X., Wu, Y., Ming, Y., & Lv, H. (2020). Remote sensing imagery super resolution based on adaptive multi‐scale feature fusion network. Sensors (Switzerland), 20(4). https://doi.org/10.3390/s20041142
