Abstract
Graph neural networks (GNNs), which exploit the topological structure of a knowledge graph (KG) to embed entities and relations in low-dimensional spaces, have shown great power in knowledge graph completion (KGC). A KG carries abundant global and local structural information; however, many GNN-based KGC models fail to capture both types of structural information despite designing complex aggregation schemes, and are not well designed to learn representations of seen entities with sparse neighborhoods in isolated subgraphs. In this paper, we find that a simple attention-based method can outperform a general GNN-based approach for KGC. We then propose a double-branch multi-attention-based graph neural network (MA-GNN) to learn more expressive entity representations that contain rich global and local structural information. Specifically, we first explore a graph attention network-based local aggregator to learn entity representations. Furthermore, we propose a snowball local attention mechanism that leverages the semantic similarity between two-hop neighbors to enrich entity embeddings. Finally, we use Transformer-based self-attention to learn long-range dependencies between entities, yielding richer representations that incorporate the global graph structure and entity features. Experimental results on five benchmark datasets show that MA-GNN achieves significant improvements over strong baselines for inductive KGC.
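The abstract names two branches: a graph-attention local aggregator and a Transformer-based global self-attention module whose outputs are combined into one entity representation. The paper's exact layer sizes, attention formulation, and fusion scheme are not given here, so the following PyTorch sketch is only a hypothetical illustration of that double-branch idea; the class name, dimensions, and sum-based fusion are assumptions, not the authors' architecture.

```python
import torch
import torch.nn as nn

class DoubleBranchEncoder(nn.Module):
    """Hypothetical sketch of a double-branch entity encoder:
    a local branch (GAT-style attention over one-hop neighbors) and
    a global branch (Transformer self-attention across entities).
    Fusion by summation is an illustrative assumption."""

    def __init__(self, num_entities: int, dim: int, num_heads: int = 4):
        super().__init__()
        self.entity_emb = nn.Embedding(num_entities, dim)
        # Local branch: additive attention score over (entity, neighbor) pairs.
        self.attn_score = nn.Linear(2 * dim, 1)
        # Global branch: one Transformer encoder layer over entity features.
        layer = nn.TransformerEncoderLayer(
            d_model=dim, nhead=num_heads, batch_first=True
        )
        self.global_encoder = nn.TransformerEncoder(layer, num_layers=1)

    def local_branch(self, ent_ids: torch.Tensor,
                     neighbor_ids: torch.Tensor) -> torch.Tensor:
        # ent_ids: (B,); neighbor_ids: (B, N) one-hop neighbor indices.
        h = self.entity_emb(ent_ids)                          # (B, D)
        nbr = self.entity_emb(neighbor_ids)                   # (B, N, D)
        pair = torch.cat([h.unsqueeze(1).expand_as(nbr), nbr], dim=-1)
        alpha = torch.softmax(self.attn_score(pair).squeeze(-1), dim=-1)
        # Attention-weighted aggregation of local neighborhood features.
        return (alpha.unsqueeze(-1) * nbr).sum(dim=1)         # (B, D)

    def forward(self, ent_ids: torch.Tensor,
                neighbor_ids: torch.Tensor) -> torch.Tensor:
        local = self.local_branch(ent_ids, neighbor_ids)
        # Global branch: self-attention across the batch of entities,
        # capturing long-range dependencies beyond the local neighborhood.
        glob = self.global_encoder(
            self.entity_emb(ent_ids).unsqueeze(0)
        ).squeeze(0)                                          # (B, D)
        return local + glob  # simple fusion; the paper's scheme may differ
```

As a usage sketch, `DoubleBranchEncoder(num_entities=1000, dim=64)` applied to a batch of entity IDs with padded one-hop neighbor lists returns one fused embedding per entity; a KGC scoring function would then consume these embeddings. The snowball two-hop attention described in the abstract is omitted here, as its exact formulation is not specified in this excerpt.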
Citation
Xu, H., Bao, J., & Liu, W. (2023). Double-Branch Multi-Attention based Graph Neural Network for Knowledge Graph Completion. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 1, pp. 15257–15271). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.acl-long.850