A Model of Text-Enhanced Knowledge Graph Representation Learning with Mutual Attention

Abstract

Recently, jointly learning the embeddings of a knowledge graph (KG) and its accompanying text has attracted considerable interest. However, previous work fails to fully incorporate the complex structural signals (from the structure representation) and semantic signals (from the text representation). This paper proposes a novel text-enhanced knowledge graph representation model that uses textual information to enrich knowledge representations. In particular, a mutual attention mechanism between the KG and the text is proposed to learn more accurate textual representations, which in turn improve the knowledge graph representation, all within a unified parameter-sharing semantic space. Unlike conventional joint models, ours requires no complicated linguistic analysis or strict alignment between the KG and the text. Moreover, the proposed model fully incorporates multi-directional signals. Experimental results show that the proposed model achieves state-of-the-art performance on both link prediction and triple classification tasks, and significantly outperforms previous text-enhanced knowledge representation models.
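
For readers who want a concrete picture of the mechanism the abstract names, below is a minimal PyTorch sketch of a mutual attention layer between a structure-based entity embedding and the token embeddings of its textual description. Everything here is illustrative: the class name MutualAttention, the single shared projection standing in for the "unified parameter sharing semantic space," and the sigmoid gate used for the text-to-KG direction are our assumptions, not the paper's published formulation.

```python
# Hypothetical sketch of mutual attention between a KG structure
# embedding and token-level text embeddings. Names, shapes, and the
# gating choice are assumptions, not the paper's exact equations.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MutualAttention(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        # One shared projection for both views: a simple stand-in for
        # the unified parameter-sharing semantic space.
        self.shared_proj = nn.Linear(dim, dim, bias=False)

    def forward(self, kg_emb: torch.Tensor, text_emb: torch.Tensor):
        # kg_emb:   (batch, dim)       structure representation of an entity
        # text_emb: (batch, seq, dim)  token embeddings of its description
        q = self.shared_proj(kg_emb).unsqueeze(1)   # (batch, 1, dim)
        k = self.shared_proj(text_emb)              # (batch, seq, dim)

        # KG -> text direction: weight description tokens by their
        # relevance to the structural embedding.
        scores = (q * k).sum(-1) / k.size(-1) ** 0.5     # (batch, seq)
        alpha = F.softmax(scores, dim=-1)
        text_enhanced = (alpha.unsqueeze(-1) * text_emb).sum(1)  # (batch, dim)

        # Text -> KG direction: gate the structural embedding with the
        # aggregated textual signal (one simple "mutual" choice).
        gate = torch.sigmoid((text_enhanced * kg_emb).sum(-1, keepdim=True))
        kg_enhanced = gate * kg_emb + (1 - gate) * text_enhanced
        return kg_enhanced, text_enhanced


# Usage with random tensors standing in for learned embeddings:
layer = MutualAttention(dim=100)
kg = torch.randn(32, 100)        # 32 entities
txt = torch.randn(32, 20, 100)   # 20-token descriptions
entity_repr, text_repr = layer(kg, txt)
```

One design note on this sketch: routing both views through the same projection matrix is one simple way to realize a shared semantic space; the paper may instead share embedding tables or deeper encoder parameters.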

Citation (APA)

Wang, Y., Zhang, H., Shi, G., Liu, Z., & Zhou, Q. (2020). A Model of Text-Enhanced Knowledge Graph Representation Learning with Mutual Attention. IEEE Access, 8, 52895–52905. https://doi.org/10.1109/ACCESS.2020.2981212
