TDN: An integrated representation learning model of knowledge graphs

Abstract

Knowledge graphs (KGs) play an important role in many artificial intelligence applications. Representation learning of KGs aims to project both entities and relations into a continuous low-dimensional space. Embedding-based representation learning has been used for KG completion, which aims to predict potential triples (head, relation, tail) in a KG. Most current methods learn representations from triple information alone, ignoring the textual knowledge and network topology of the KG, which leads to ambiguous completions. To address this problem and achieve more accurate KG completion, we propose a new representation learning model, the TDN model, which jointly embeds the information of triples, text descriptions, and the network structure of the KG in a low-dimensional vector space. We define the TDN framework and explore the methodology for implementing TDN embedding. To verify the effectiveness of the proposed model, we evaluate TDN via link-prediction experiments on real-world datasets. The experimental results confirm these claims and show that TDN-based embedding significantly outperforms the baselines.
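
The abstract does not spell out how the three information sources are fused, so the following is only a minimal Python sketch of one plausible reading: a TransE-style triple score whose entity vectors are fused from a triple embedding, a projection of precomputed text-description features, and a network-structure embedding. All names, dimensions, and the fusion rule below are illustrative assumptions, not the paper's exact formulation.

    # Hypothetical sketch of a TDN-style joint embedding (not the authors' code).
    import torch
    import torch.nn as nn

    class TDNSketch(nn.Module):
        def __init__(self, n_entities, n_relations, dim=100, text_dim=300):
            super().__init__()
            self.ent_triple = nn.Embedding(n_entities, dim)   # triple-based entity vectors
            self.ent_struct = nn.Embedding(n_entities, dim)   # network-structure vectors
            self.rel = nn.Embedding(n_relations, dim)          # relation vectors
            self.text_proj = nn.Linear(text_dim, dim)          # projects precomputed description features
            self.fuse = nn.Linear(3 * dim, dim)                 # learned fusion of the three entity views

        def entity(self, idx, text_feat):
            # text_feat: precomputed text-description embeddings, shape (batch, text_dim)
            views = torch.cat(
                [self.ent_triple(idx), self.ent_struct(idx), self.text_proj(text_feat)],
                dim=-1,
            )
            return self.fuse(views)

        def score(self, h, r, t, h_text, t_text):
            # TransE-style distance: smaller value => more plausible triple (h, r, t)
            h_e = self.entity(h, h_text)
            t_e = self.entity(t, t_text)
            return torch.norm(h_e + self.rel(r) - t_e, p=2, dim=-1)

    # Training would typically minimize a margin-based ranking loss between observed
    # triples and corrupted (negative) triples, as is standard for link prediction.
    model = TDNSketch(n_entities=40943, n_relations=18)  # WN18-sized counts, purely illustrative
    loss_fn = nn.MarginRankingLoss(margin=1.0)

In a link-prediction evaluation, such a model ranks candidate head or tail entities for each test triple by this score; how TDN actually combines the three views and trains the embedding is detailed in the paper itself.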

Citation (APA)

Kang, X., Yao, H., Li, Q., Li, X., Liu, C., & Dong, L. (2019). TDN: An integrated representation learning model of knowledge graphs. IEEE Access, 7, 55199–55205. https://doi.org/10.1109/ACCESS.2019.2913086
