Incorporating Global Information in Local Attention for Knowledge Representation Learning

15 citations · 53 Mendeley readers

Abstract

Graph Attention Networks (GATs) have proven to be a promising model that takes advantage of a localized attention mechanism to perform knowledge representation learning (KRL) on graph-structured data, e.g., Knowledge Graphs (KGs). While such approaches model entities' local pairwise importance, they lack the ability to model an entity's global importance relative to the other entities of a KG. This causes such models to miss critical information in tasks where global information is a significant component, such as knowledge representation learning. To address this issue, we incorporate global information into the GAT family of models through the use of scaled entity importance, which is calculated by an attention-based global random walk algorithm. In the context of KRL, incorporating global information boosts performance significantly. Experimental results on KG entity prediction against state-of-the-art methods demonstrate the effectiveness of our proposed model.
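Although the abstract does not spell out the architecture, the core idea can be made concrete with a short sketch. The NumPy code below is a minimal illustration under stated assumptions, not the authors' implementation: a plain PageRank-style power iteration stands in for the paper's attention-based global random walk, and its scaled entity scores are injected as an additive bias (weighted by a hypothetical coefficient beta) into standard GAT attention logits. All function names, the additive fusion, and the hyperparameters here are illustrative.

```python
import numpy as np

def global_importance(adj, damping=0.85, iters=50):
    # Power-iteration random walk (PageRank-style) over the KG's
    # adjacency matrix; a stand-in for the paper's attention-based
    # global random walk, returning one importance score per entity.
    adj = np.asarray(adj, dtype=float)
    n = adj.shape[0]
    row_sums = adj.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1.0           # keep dangling rows finite
    trans = adj / row_sums                  # row-stochastic transitions
    score = np.full(n, 1.0 / n)
    for _ in range(iters):
        score = (1.0 - damping) / n + damping * (trans.T @ score)
    return score / score.max()              # scale scores into (0, 1]

def gat_layer_with_global(h, adj, W, a, g, beta=1.0):
    # One GAT-style layer whose attention logits are shifted by the
    # scaled global importance g of each neighbor. The additive fusion
    # and beta are assumptions, not the paper's published formulation.
    z = h @ W                               # (n, d') projected entity features
    n = z.shape[0]
    logits = np.full((n, n), -1e9)          # default: mask non-edges
    for i in range(n):
        for j in range(n):
            if adj[i, j] > 0 or i == j:     # attend over neighbors + self-loop
                e = a @ np.concatenate([z[i], z[j]])
                e = e if e > 0 else 0.2 * e  # LeakyReLU, as in GAT
                logits[i, j] = e + beta * g[j]
    att = np.exp(logits - logits.max(axis=1, keepdims=True))
    att /= att.sum(axis=1, keepdims=True)   # softmax over each neighborhood
    return att @ z                          # importance-aware aggregation

# Tiny usage example on a random 5-entity graph.
rng = np.random.default_rng(0)
adj = (rng.random((5, 5)) > 0.6).astype(float)
h = rng.normal(size=(5, 4))                 # entity embeddings
W = rng.normal(size=(4, 3))                 # projection matrix
a = rng.normal(size=6)                      # attention vector over [z_i || z_j]
g = global_importance(adj)
out = gat_layer_with_global(h, adj, W, a, g)
print(out.shape)                            # (5, 3)
```

An additive bias on the attention logits is just one simple way to fuse the two signals; the paper's actual scaling scheme may differ.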

Citation (APA)

Zhao, Y., Zhou, H., Xie, R., Zhuang, F., Li, Q., & Liu, J. (2021). Incorporating Global Information in Local Attention for Knowledge Representation Learning. In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021 (pp. 1341–1351). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.findings-acl.115
