Knowledge graph embedding is an essential problem in knowledge extraction. Recently, translation-based embedding models (e.g., TransE) have received increasing attention. These methods interpret a relation between entities as a translation from the head entity to the tail entity and achieve promising performance on knowledge graph completion. Previous work attempts to transform entity embeddings with respect to the given relation to make them distinguishable, and naturally assumes that this relation-specific transformation should act as an attention mechanism, i.e., focus on only a subset of the attributes. However, we find that previous methods fail to create such an attention mechanism because they ignore the hierarchical routine of human cognition. When predicting whether a relation holds between two entities, people first check the categories of the entities, and only then focus on fine-grained, relation-related attributes to make the decision. In other words, attention should take effect on entities that have already been filtered by the right category. In this paper, we propose a novel knowledge graph embedding method named TransAt that learns the translation-based embedding, relation-related categories of entities, and relation-related attention simultaneously. Extensive experiments show that our approach significantly outperforms state-of-the-art methods on public datasets and that it learns attention that genuinely varies across relations.
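The abstract does not spell out the scoring functions, so the following is only a minimal sketch. TransE scores a candidate triple (h, r, t) by the distance between the translated head and the tail; a relation-attention variant in the spirit described above could take a form like the second line, where the category filter \sigma_r and the relation-specific attention projection P_r are illustrative assumptions rather than the paper's exact definitions:

    % TransE: relation as a translation from head to tail (standard formulation)
    f_{\mathrm{TransE}}(h, r, t) = \lVert \mathbf{h} + \mathbf{r} - \mathbf{t} \rVert_{1/2}

    % Hypothetical attention-augmented score: \sigma_r checeks category compatibility
    % is replaced below by the correct spelling; P_r keeps only relation-relevant attributes
    f_{r}(h, t) \approx \lVert P_r\bigl(\sigma_r(\mathbf{h})\bigr) + \mathbf{r} - P_r\bigl(\sigma_r(\mathbf{t})\bigr) \rVert

Here \sigma_r would discard entities incompatible with the categories relation r expects, and P_r would down-weight attributes irrelevant to r, so the attention only operates after the category check, matching the hierarchical routine described above.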
Qian, W., Fu, C., Zhu, Y., Cai, D., & He, X. (2018). Translating embeddings for Knowledge graph completion with relation attention mechanism. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2018-July, pp. 4286–4292). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2018/596