The fusion of synthetic aperture radar (SAR) and optical satellite data is widely used for deep learning based scene classification. Counter-intuitively, such neural networks are still sensitive to changes in single data sources, which can lead to unexpected behavior and a significant drop in performance when individual sensors fail or when clouds obscure the optical image. In this paper we incorporate source-wise out-of-distribution (OOD) detection into the fusion process at test time so that uninformative or even harmful information is not considered for the prediction. To this end, we propose a modified training procedure together with an adaptive fusion approach that weights the extracted information based on the source-wise in-distribution probabilities. We evaluate the proposed approach on the BigEarthNet multilabel scene classification data set and several additional OOD test cases, such as missing or damaged data, clouds, unknown classes, and coverage by snow and ice. The results show a significant improvement in robustness to different types of OOD data affecting only individual data sources. At the same time, the approach maintains the classification performance of the compared baseline approaches. The code for the experiments of this paper is available on GitHub: https://github.com/JakobCode/OOD_DataFusion.
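The core idea of the adaptive fusion, weighting each source's features by its in-distribution probability, can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function name `adaptive_fusion`, the normalization scheme, and all parameter names are assumptions for the sake of the example.

```python
import numpy as np

def adaptive_fusion(feat_sar, feat_opt, p_in_sar, p_in_opt, eps=1e-8):
    """Hypothetical sketch of source-wise adaptive fusion.

    feat_sar, feat_opt : feature vectors extracted from the SAR and
                         optical branches of the network.
    p_in_sar, p_in_opt : in-distribution probabilities per source, as
                         produced by a source-wise OOD detector.
    Features are combined with weights normalized to sum to one, so a
    source judged out-of-distribution contributes little to the fused
    representation passed to the classifier head.
    """
    total = p_in_sar + p_in_opt + eps  # eps avoids division by zero
    w_sar = p_in_sar / total
    w_opt = p_in_opt / total
    return w_sar * feat_sar + w_opt * feat_opt
```

With equal in-distribution probabilities this reduces to averaging the two feature vectors; if one source is detected as fully out-of-distribution (probability near zero), the fused representation is dominated by the remaining source.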
CITATION STYLE
Gawlikowski, J., Saha, S., Niebling, J., & Zhu, X. X. (2023). Handling unexpected inputs: incorporating source-wise out-of-distribution detection into SAR-optical data fusion for scene classification. EURASIP Journal on Advances in Signal Processing, 2023(1). https://doi.org/10.1186/s13634-023-01008-z