Inductively Representing Out-of-Knowledge-Graph Entities by Optimal Estimation Under Translational Assumptions

Abstract

Conventional Knowledge Graph Completion (KGC) assumes that all test entities appear during training. In real-world scenarios, however, Knowledge Graphs (KGs) evolve quickly, with out-of-knowledge-graph (OOKG) entities added frequently, and these entities need to be represented efficiently. Most existing Knowledge Graph Embedding (KGE) methods cannot represent OOKG entities without costly retraining on the whole KG. To enhance efficiency, we propose a simple and effective method that inductively represents OOKG entities by their optimal estimation under translational assumptions. Moreover, given pretrained embeddings of the in-knowledge-graph (IKG) entities, our method requires no additional learning. Experimental results on two KGC tasks with OOKG entities show that our method outperforms previous methods by a large margin while being more efficient.
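The translational assumption (as in TransE) means that h + r ≈ t holds for a true triple (h, r, t), so an OOKG entity's embedding can be estimated from the auxiliary triples that connect it to IKG entities whose embeddings are already trained. Below is a minimal sketch of that general idea in Python, assuming TransE-style pretrained embeddings and a simple average over per-triple estimates; the function and variable names are illustrative, and this is not necessarily the paper's exact estimation formula.

    # Sketch: estimating an out-of-knowledge-graph (OOKG) entity embedding
    # from pretrained TransE-style embeddings, without retraining.
    # Names (estimate_ookg_embedding, aux_triples, ...) are illustrative only.
    import numpy as np

    def estimate_ookg_embedding(aux_triples, entity_emb, relation_emb, ookg_id):
        """Estimate an OOKG entity's vector from auxiliary triples linking it
        to in-knowledge-graph (IKG) entities.

        aux_triples:  iterable of (head_id, relation_id, tail_id) containing ookg_id
        entity_emb:   dict mapping IKG entity id -> np.ndarray embedding
        relation_emb: dict mapping relation id  -> np.ndarray embedding
        """
        estimates = []
        for h, r, t in aux_triples:
            if h == ookg_id and t in entity_emb:
                # OOKG entity is the head: h ≈ t - r under the translational assumption.
                estimates.append(entity_emb[t] - relation_emb[r])
            elif t == ookg_id and h in entity_emb:
                # OOKG entity is the tail: t ≈ h + r.
                estimates.append(entity_emb[h] + relation_emb[r])
        if not estimates:
            raise ValueError("No auxiliary triples link the OOKG entity to IKG entities.")
        # Combine the per-triple estimates; averaging is one simple choice.
        return np.mean(estimates, axis=0)

Because the estimate is a closed-form combination of existing embeddings, no gradient updates on the full KG are needed, which is where the efficiency gain over retraining-based approaches comes from.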

Citation (APA)

Dai, D., Zheng, H., Luo, F., Yang, P., Chang, B., & Sui, Z. (2021). Inductively Representing Out-of-Knowledge-Graph Entities by Optimal Estimation Under Translational Assumptions. In RepL4NLP 2021 - 6th Workshop on Representation Learning for NLP, Proceedings of the Workshop (pp. 83–89). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.repl4nlp-1.10
