A relational memory-based embedding model for triple classification and search personalization

17 citations · 141 Mendeley readers

Abstract

Knowledge graph embedding methods often suffer from simply memorizing valid triples when predicting new ones for triple classification and search personalization. To address this, we introduce a novel embedding model, named R-MeN, that explores a relational memory network to encode potential dependencies in relationship triples. R-MeN treats each triple as a sequence of 3 input vectors that recurrently interact with a memory via a transformer self-attention mechanism; the model thus encodes new information from the interactions between the memory and each input vector and returns a corresponding output vector. R-MeN then feeds these 3 returned vectors to a convolutional neural network-based decoder to produce a scalar score for the triple. Experimental results show that R-MeN obtains state-of-the-art results on SEARCH17 for the search personalization task, and on WN11 and FB13 for the triple classification task.
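For intuition, the sketch below (Python/PyTorch, assuming a recent PyTorch version) illustrates the kind of scoring architecture the abstract describes: a triple (h, r, t) is treated as a 3-step sequence whose vectors attend over a memory, and the 3 resulting vectors are passed to a small CNN decoder that outputs a scalar score. All class and parameter names are illustrative, and the single attention read over fixed learned memory slots is a simplifying stand-in for the paper's recurrently updated relational memory; this is not the authors' released implementation.

```python
# Minimal sketch of an R-MeN-style triple scorer (illustrative assumptions,
# not the authors' code). Each triple is a sequence of 3 vectors that attend
# over a learned memory; a CNN-based decoder turns the 3 outputs into a score.
import torch
import torch.nn as nn


class RMeNSketch(nn.Module):
    def __init__(self, n_entities, n_relations, dim=128, mem_slots=4,
                 n_heads=4, n_filters=64):
        super().__init__()
        self.ent = nn.Embedding(n_entities, dim)
        self.rel = nn.Embedding(n_relations, dim)
        self.pos = nn.Parameter(torch.zeros(3, dim))              # positional embeddings for (h, r, t)
        self.memory = nn.Parameter(torch.randn(mem_slots, dim))   # learned memory slots (simplification)
        self.attn = nn.MultiheadAttention(dim, n_heads, batch_first=True)
        self.conv = nn.Conv1d(dim, n_filters, kernel_size=1)      # decoder over the 3 positions
        self.out = nn.Linear(3 * n_filters, 1)

    def forward(self, h_idx, r_idx, t_idx):
        # Build the 3-step input sequence: (batch, 3, dim)
        x = torch.stack([self.ent(h_idx), self.rel(r_idx), self.ent(t_idx)], dim=1)
        x = x + self.pos                                           # inject position information
        # Each input vector attends over the shared memory slots.
        mem = self.memory.unsqueeze(0).expand(x.size(0), -1, -1)
        y, _ = self.attn(query=x, key=mem, value=mem)              # (batch, 3, dim)
        # CNN-based decoder: convolve over the 3 returned vectors, then score.
        feats = torch.relu(self.conv(y.transpose(1, 2)))           # (batch, n_filters, 3)
        return self.out(feats.flatten(1)).squeeze(-1)              # (batch,) scalar scores


# Usage example: score a small batch of triples.
model = RMeNSketch(n_entities=1000, n_relations=20)
scores = model(torch.tensor([1, 2]), torch.tensor([3, 4]), torch.tensor([5, 6]))
```

In this reading, a higher score indicates that the triple is more plausible; the real model is trained so that valid triples score higher than corrupted ones.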

Cite (APA)

Nguyen, D. Q., Nguyen, T. D., & Phung, D. (2020). A relational memory-based embedding model for triple classification and search personalization. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 3429–3435). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.acl-main.313
