Knowledge graph (KG) entity typing aims to infer missing entity type instances in a KG; it is a significant but still under-explored subtask of knowledge graph completion. In this paper, we propose a novel approach to KG entity typing that is trained by jointly utilizing local typing knowledge from existing entity type assertions and global triple knowledge from the KG. Specifically, we present two distinct knowledge-driven mechanisms for entity type inference and build two novel embedding models to realize them. A joint model combining the two is then used to infer missing entity type instances, favoring inferences that agree with both the entity type assertions and the triple knowledge in the KG. Experimental results on two real-world datasets (Freebase and YAGO) demonstrate the effectiveness of the proposed mechanisms and models for improving KG entity typing. The source code and data of this paper can be obtained from: https://github.com/Adam1679/ConnectE.
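The abstract does not spell out the scoring functions of the two mechanisms. The sketch below is only illustrative of how a local typing score and a global triple score could be combined into a joint score, assuming translation-style embeddings; the function names (`score_local`, `score_global`, `joint_score`), the projection matrix `M`, and the mixing weight `alpha` are assumptions for illustration, not taken from the paper or its released code.

```python
import numpy as np

def score_local(e, t, M):
    """Local typing mechanism (sketch): project an entity embedding e into the
    type space with matrix M and measure its distance to the type embedding t."""
    return np.linalg.norm(M @ e - t)

def score_global(t_head, r, t_tail):
    """Global triple mechanism (sketch): replace a triple's entities by their
    type embeddings and apply a TransE-style translation with relation r."""
    return np.linalg.norm(t_head + r - t_tail)

def joint_score(e, t, M, t_head, r, t_tail, alpha=0.5):
    """Joint inference (sketch): lower scores favor (entity, type) pairs that
    agree with both the type assertions and the triple knowledge."""
    return alpha * score_local(e, t, M) + (1.0 - alpha) * score_global(t_head, r, t_tail)

# Toy usage with random embeddings (dimensions are arbitrary).
d_e, d_t = 100, 50
rng = np.random.default_rng(0)
e = rng.normal(size=d_e)           # entity embedding
t = rng.normal(size=d_t)           # candidate type embedding
M = rng.normal(size=(d_t, d_e))    # entity-to-type projection
r = rng.normal(size=d_t)           # relation embedding in type space
t_head, t_tail = rng.normal(size=d_t), rng.normal(size=d_t)
print(joint_score(e, t, M, t_head, r, t_tail))
```

In a sketch like this, candidate types for an entity would be ranked by the joint score, so that a type is preferred only when it fits both sources of knowledge.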
Citation
Zhao, Y., Zhang, A., Xie, R., Liu, K., & Wang, X. (2020). Connecting embeddings for knowledge graph entity typing. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 6419–6428). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.acl-main.572