AprilE: Attention with Pseudo Residual Connection for Knowledge Graph Embedding

10 Citations · 69 Mendeley Readers

Abstract

Knowledge graph embedding maps entities and relations into a low-dimensional vector space. However, it is still challenging for many existing methods to model diverse relational patterns, especially symmetric and antisymmetric relations. To address this issue, we propose a novel model, AprilE, which employs triple-level self-attention and a pseudo residual connection to model relational patterns. The triple-level self-attention treats the head entity, relation, and tail entity as a sequence and captures the dependencies within a triple, while the pseudo residual connection retains primitive semantic features. Furthermore, to deal with symmetric and antisymmetric relations, two schemas of score function are designed via a position-adaptive mechanism. Experimental results on public datasets demonstrate that our model produces expressive knowledge embeddings and significantly outperforms most state-of-the-art works.
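The core idea in the abstract can be sketched in a few lines: treat the triple (head, relation, tail) as a length-3 sequence, run single-head self-attention over it to capture intra-triple dependencies, and re-add the original embeddings as a residual so primitive semantics survive. The sketch below is an illustration under assumed shapes and parameter names (`Wq`, `Wk`, `Wv` are hypothetical projection matrices), not the authors' exact architecture or score function.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def triple_self_attention(h, r, t, Wq, Wk, Wv):
    """Single-head self-attention over the (head, relation, tail)
    triple viewed as a length-3 sequence, followed by a residual
    connection that re-adds the original embeddings (a simplified
    stand-in for the paper's pseudo residual connection)."""
    X = np.stack([h, r, t])               # (3, d) triple-as-sequence
    Q, K, V = X @ Wq, X @ Wk, X @ Wv      # projected queries/keys/values
    d = X.shape[1]
    attn = softmax(Q @ K.T / np.sqrt(d))  # (3, 3) dependencies within the triple
    out = attn @ V                        # attended features
    return out + X                        # residual: retain primitive semantics

# Toy usage with random embeddings (dimension d is arbitrary here).
rng = np.random.default_rng(0)
d = 8
h, r, t = rng.normal(size=(3, d))
Wq, Wk, Wv = rng.normal(size=(3, d, d)) * 0.1
Y = triple_self_attention(h, r, t, Wq, Wk, Wv)
print(Y.shape)  # (3, 8): one refined vector per triple element
```

The residual term is what lets the model fall back on the untouched embeddings when attention adds little; the paper's position-adaptive score functions would then be computed on top of these refined representations.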

Citation (APA)

Liu, Y., Wang, P., Li, Y., Shao, Y., & Xu, Z. (2020). AprilE: Attention with Pseudo Residual Connection for Knowledge Graph Embedding. In COLING 2020 - 28th International Conference on Computational Linguistics, Proceedings of the Conference (pp. 508–518). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.coling-main.44
