MEKER: Memory Efficient Knowledge Embedding Representation for Link Prediction and Question Answering


Abstract

Knowledge Graphs (KGs) are symbolically structured storages of facts. A KG embedding provides a concise representation of this data for NLP tasks that require implicit knowledge about the real world. However, the KGs useful in practical NLP applications are enormous, and training embeddings over them raises memory cost issues. We represent a KG as a 3rd-order binary tensor and move beyond the standard CP decomposition (Hitchcock, 1927) by using a data-specific generalized version of it (Hong et al., 2020). This generalization of the standard CP-ALS algorithm allows obtaining optimization gradients without a backpropagation mechanism, which reduces the memory needed in training while providing computational benefits. We propose MEKER, a memory-efficient KG embedding model that yields SOTA-comparable performance on link prediction tasks and KG-based Question Answering.
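The two core ideas in the abstract can be sketched concretely: a KG with n entities and m relations becomes a binary tensor X of shape (n, m, n), with X[s, p, o] = 1 iff the triple (s, p, o) holds; a rank-R CP decomposition then scores a triple as the elementwise product of the subject, relation, and object factor vectors, summed over the rank. The toy entities, relation names, and rank below are illustrative assumptions, not MEKER's actual data or hyperparameters:

```python
import numpy as np

# Toy KG: illustrative entities/relations, not from the paper's datasets.
entities = ["berlin", "germany", "paris", "france"]
relations = ["capital_of"]
triples = [("berlin", "capital_of", "germany"),
           ("paris", "capital_of", "france")]

n, m = len(entities), len(relations)
e_idx = {e: i for i, e in enumerate(entities)}
r_idx = {r: i for i, r in enumerate(relations)}

# 3rd-order binary tensor: axes are (subject, relation, object).
X = np.zeros((n, m, n))
for s, p, o in triples:
    X[e_idx[s], r_idx[p], e_idx[o]] = 1.0

# CP factors: one rank-R vector per entity and per relation.
# (Rank 4 and random init are placeholder choices for the sketch.)
R = 4
rng = np.random.default_rng(0)
E = rng.normal(size=(n, R))  # entity embeddings (shared for subj/obj)
W = rng.normal(size=(m, R))  # relation embeddings

def cp_score(s, p, o):
    """CP-style triple score: sum_r E[s,r] * W[p,r] * E[o,r]."""
    return float(np.sum(E[e_idx[s]] * W[r_idx[p]] * E[e_idx[o]]))

score = cp_score("berlin", "capital_of", "germany")
```

Link prediction then amounts to ranking candidate objects o by `cp_score(s, p, o)` for a query (s, p, ?); the paper's contribution is fitting these factors with a generalized CP-ALS whose gradients are obtained in closed form rather than via backpropagation.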

CITATION STYLE

APA

Chekalina, V., Razzhigaev, A., Panchenko, A., Sayapin, A., & Frolov, E. (2022). MEKER: Memory Efficient Knowledge Embedding Representation for Link Prediction and Question Answering. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 355–365). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.acl-srw.27
