Remote Sensing Scene Graph and Knowledge Graph Matching with Parallel Walking Algorithm

Citations: 7 · Readers (Mendeley): 14

Abstract

In deep neural network training and prediction, limited GPU memory and computing resources force massive image data to be cropped into samples of restricted size. Moreover, to improve the generalization ability of the model, these samples need to be randomly distributed across the experimental area, so their background information is often incomplete or even missing. Under these conditions, a knowledge graph needs to be introduced into remote sensing semantic segmentation. However, although a single sample contains only a limited number of geographic categories, the combinations of geographic objects vary widely across samples, and the categories involved often span different branches of the classification system. Existing studies therefore tend to treat all categories in the knowledge graph as candidates when segmenting a specific sample, which leads to high computational cost and low efficiency. To address these problems, a parallel walking algorithm based on cross-modality information is proposed for scene graph–knowledge graph matching (PWGM). The algorithm uses a graph neural network to map the visual features of the scene graph into the semantic space of the knowledge graph through anchors, and designs a parallel walking algorithm over the knowledge graph that takes the visual features of complex scenes into account. Building on this algorithm, we propose a semantic segmentation model for remote sensing. Experiments demonstrate that our model improves the overall accuracy by 3.7% compared with KGGAT (a semantic segmentation model using a knowledge graph and a graph attention network (GAT)), by 5.1% compared with GAT, and by 13.3% compared with U-Net. Our study not only effectively improves the recognition accuracy and efficiency of remote sensing objects, but also offers a useful exploration of how deep learning can evolve from a purely data-driven paradigm toward a data-and-knowledge dual-driven one.
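The abstract describes the method only at a high level. As a rough, hypothetical illustration (not the authors' implementation), the core idea of anchor-based matching followed by a parallel walk might look like the sketch below: scene-graph node features are projected into the knowledge-graph embedding space, the most similar knowledge-graph categories serve as anchors, and a breadth-first walk started from all anchors in parallel collects a small candidate label set instead of the full category inventory. All function names, shapes, and thresholds here are assumptions.

```python
# Hypothetical sketch of the PWGM idea: project scene-graph node features into the
# knowledge-graph embedding space, then walk outward from anchor categories in
# parallel to gather a small candidate set rather than using every KG category.
import numpy as np
from collections import deque


def project_to_semantic_space(node_feats: np.ndarray, W: np.ndarray) -> np.ndarray:
    """Toy stand-in for the GNN mapping: a single linear projection of visual
    features into the knowledge-graph embedding space."""
    return node_feats @ W


def parallel_walk(kg_adj: dict, anchors: list, max_hops: int = 2) -> set:
    """Breadth-first walk started from every anchor category at once, returning
    the categories reachable within `max_hops` hops."""
    candidates = set(anchors)
    frontier = deque((a, 0) for a in anchors)
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_hops:
            continue
        for nxt in kg_adj.get(node, []):
            if nxt not in candidates:
                candidates.add(nxt)
                frontier.append((nxt, depth + 1))
    return candidates


def match_categories(node_feats, W, kg_embeddings, kg_adj, top_k=2, max_hops=2):
    """Pick anchor categories by cosine similarity in the semantic space, then
    expand them with the parallel walk to form the candidate label set."""
    proj = project_to_semantic_space(node_feats, W)
    proj = proj / np.linalg.norm(proj, axis=1, keepdims=True)
    names = list(kg_embeddings)
    emb = np.stack([kg_embeddings[n] for n in names])
    emb = emb / np.linalg.norm(emb, axis=1, keepdims=True)
    sims = proj @ emb.T                     # (num_scene_nodes, num_kg_nodes)
    anchors = {names[j] for row in sims for j in row.argsort()[-top_k:]}
    return parallel_walk(kg_adj, sorted(anchors), max_hops)
```

In the paper, the projection is a graph neural network rather than the single linear map shown here, and the walk also takes the visual features of complex scenes into account; the sketch is meant only to convey how anchor-based matching can narrow the candidate categories for a given sample.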

Citation (APA)
Cui, W., Hao, Y., Xu, X., Feng, Z., Zhao, H., Xia, C., & Wang, J. (2022). Remote Sensing Scene Graph and Knowledge Graph Matching with Parallel Walking Algorithm. Remote Sensing, 14(19). https://doi.org/10.3390/rs14194872
