Fusing Multiseasonal Sentinel-2 Imagery for Urban Land Cover Classification with Multibranch Residual Convolutional Neural Networks

Abstract

Exploiting multitemporal Sentinel-2 images for urban land cover classification has become an important research topic, since these images are now globally available at relatively fine temporal resolution and thus offer great potential for large-scale land cover mapping. However, appropriate exploitation of the images must address problems inherent to optical satellite imagery, such as cloud cover. To this end, we propose a simple yet effective decision-level fusion approach for urban land cover prediction from multiseasonal Sentinel-2 images, using state-of-the-art residual convolutional neural networks (ResNets). We extensively tested the approach in a cross-validation manner over a seven-city study area in central Europe. Both quantitative and qualitative results demonstrated the superior performance of the proposed fusion approach over several baseline approaches, including observation- and feature-level fusion.
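The abstract does not spell out the fusion rule, but a common form of decision-level fusion is to average the per-class probabilities produced by each seasonal branch and then take the most likely class. The sketch below illustrates that idea with made-up probabilities for four hypothetical seasonal ResNet branches; it is not the authors' implementation.

```python
import numpy as np

# Hypothetical class probabilities from four seasonal ResNet branches
# (one row per season, one column per land cover class).
# The values are illustrative only.
spring = np.array([0.7, 0.2, 0.1])
summer = np.array([0.6, 0.3, 0.1])
fall   = np.array([0.2, 0.7, 0.1])
winter = np.array([0.5, 0.4, 0.1])

def fuse_decisions(branch_probs):
    """Decision-level fusion by averaging: combine per-branch class
    probabilities into one distribution, then pick the argmax class."""
    fused = branch_probs.mean(axis=0)
    return fused, int(np.argmax(fused))

fused, label = fuse_decisions(np.stack([spring, summer, fall, winter]))
print(fused, label)
```

Because each branch classifies independently, a season obscured by clouds degrades only its own prediction; averaging over the remaining seasons can still recover the correct class.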

Citation (APA)
Qiu, C., Mou, L., Schmitt, M., & Zhu, X. X. (2020). Fusing Multiseasonal Sentinel-2 Imagery for Urban Land Cover Classification with Multibranch Residual Convolutional Neural Networks. IEEE Geoscience and Remote Sensing Letters, 17(10), 1787–1791. https://doi.org/10.1109/LGRS.2019.2953497
