A VHR Bi-Temporal Remote-Sensing Image Change Detection Network Based on Swin Transformer

Abstract

Change detection (CD), a special remote-sensing (RS) segmentation task, faces several challenges in very high-resolution (VHR) imagery: alignment errors and illumination variation, dense small targets, and large intraclass variance in the background. Recent methods mitigate the misjudgments caused by illumination variation and alignment errors by strengthening global modeling, but the latter two problems remain insufficiently addressed. In this paper, we propose a new CD model, SFCD, which improves feature extraction for small targets by introducing a shifted-window (Swin) transformer. We also design a foreground-aware fusion module that uses attention gates to trim low-level feature responses, so that the changed regions receive more attention than the background when they are recovered, thereby reducing background interference. We evaluated our model on two CD datasets, LEVIR-CD and CDD, obtaining F1 scores of 91.78 and 97.87, respectively. The experimental results and visual interpretation show that our model outperforms several previous CD models. In addition, we adjusted the parameters and structure of the standard model to develop a lightweight version that surpasses most models in accuracy with only 1.55 M parameters, further validating the effectiveness of our design.
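The abstract describes attention gates that trim low-level feature responses so changed regions are emphasized over the background. The sketch below (PyTorch) shows a generic additive attention gate of this kind; the module name `AttentionGate`, the channel sizes, and the exact gating formulation are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionGate(nn.Module):
    """Additive attention gate (illustrative sketch, not the paper's exact module).

    A coarse decoder (gating) feature re-weights high-resolution skip features,
    suppressing background responses before feature fusion.
    """

    def __init__(self, skip_channels: int, gate_channels: int, inter_channels: int):
        super().__init__()
        # 1x1 convolutions project skip and gating features into a common space.
        self.theta = nn.Conv2d(skip_channels, inter_channels, kernel_size=1)
        self.phi = nn.Conv2d(gate_channels, inter_channels, kernel_size=1)
        self.psi = nn.Conv2d(inter_channels, 1, kernel_size=1)

    def forward(self, skip: torch.Tensor, gate: torch.Tensor) -> torch.Tensor:
        # Bring the gating signal to the spatial size of the skip features.
        gate = F.interpolate(gate, size=skip.shape[2:], mode="bilinear", align_corners=False)
        # Additive attention: combine projections and squash to a [0, 1] mask.
        attn = torch.sigmoid(self.psi(F.relu(self.theta(skip) + self.phi(gate))))
        # Trim the low-level responses: background locations are attenuated.
        return skip * attn


if __name__ == "__main__":
    # Toy shapes: a high-resolution skip map and a coarser decoder feature.
    skip = torch.randn(1, 64, 128, 128)
    gate = torch.randn(1, 128, 64, 64)
    out = AttentionGate(skip_channels=64, gate_channels=128, inter_channels=32)(skip, gate)
    print(out.shape)  # torch.Size([1, 64, 128, 128])
```

In a foreground-aware fusion setting, the gated skip features would then be concatenated or added to the upsampled decoder features, so that the subsequent decoding step attends mainly to the changed foreground.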

Citation (APA)

Teng, Y., Liu, S., Sun, W., Yang, H., Wang, B., & Jia, J. (2023). A VHR Bi-Temporal Remote-Sensing Image Change Detection Network Based on Swin Transformer. Remote Sensing, 15(10). https://doi.org/10.3390/rs15102645
