Learning a multi-branch neural network from multiple sources for knowledge adaptation in remote sensing imagery

36 Citations · 21 Readers (Mendeley)

Abstract

In this paper, we propose a multi-branch neural network, called MB-Net, for knowledge adaptation from multiple remote sensing scene datasets acquired with different sensors over diverse locations and manually labeled by different experts. Our aim is to learn invariant feature representations from multiple source domains with labeled images and one target domain with unlabeled images. To this end, we define for MB-Net an objective function that mitigates the multiple domain shifts at both the feature-representation and decision levels, while retaining the ability to discriminate between different land-cover classes. The complete architecture is trainable end-to-end via backpropagation. In the experiments, we demonstrate the effectiveness of the proposed method on a new multiple-domain dataset created from four heterogeneous scene datasets well known to the remote sensing community: the University of California Merced (UC-Merced) dataset, the Aerial Image Dataset (AID), the PatternNet dataset, and the Northwestern Polytechnical University (NWPU) dataset. In particular, the method boosts the average accuracy over all transfer scenarios to 89.05%, compared to 78.53% for a standard architecture trained only with the cross-entropy loss.
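The abstract describes an objective that combines a supervised classification loss on the labeled sources with alignment terms that reduce the source-target discrepancy at both the feature and decision (output) levels. The sketch below illustrates that general idea in numpy; the function names, the linear-kernel MMD as the discrepancy measure, and the weighting parameter `lam` are all assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

def mmd2(x, y):
    """Squared Maximum Mean Discrepancy with a linear kernel: the
    distance between the mean feature vectors of two domains.
    (Illustrative choice; the paper may use a different measure.)"""
    diff = x.mean(axis=0) - y.mean(axis=0)
    return float(diff @ diff)

def cross_entropy(logits, labels):
    """Mean cross-entropy over labeled source samples."""
    z = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return float(-log_probs[np.arange(len(labels)), labels].mean())

def mb_net_style_loss(source_feats, source_logits, source_labels,
                      target_feats, target_logits, lam=1.0):
    """Hypothetical composite objective: classification loss on each
    labeled source, plus alignment between every source and the
    unlabeled target at both the feature and the decision level."""
    cls = sum(cross_entropy(l, y)
              for l, y in zip(source_logits, source_labels))
    feat_align = sum(mmd2(f, target_feats) for f in source_feats)
    dec_align = sum(mmd2(l, target_logits) for l in source_logits)
    return cls + lam * (feat_align + dec_align)
```

When the source and target batches come from the same distribution, the alignment terms vanish and the objective reduces to the plain cross-entropy baseline the abstract compares against.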

Citation (APA)

Al Rahhal, M. M., Bazi, Y., Abdullah, T., Mekhalfi, M. L., AlHichri, H., & Zuair, M. (2018). Learning a multi-branch neural network from multiple sources for knowledge adaptation in remote sensing imagery. Remote Sensing, 10(12). https://doi.org/10.3390/rs10121890
