Highly Efficient Knowledge Graph Embedding Learning with Orthogonal Procrustes Analysis

Abstract

Knowledge Graph Embeddings (KGEs) have been intensively explored in recent years due to their promise for a wide range of applications. However, existing studies focus on improving final model performance without acknowledging the computational cost of the proposed approaches, in terms of execution time and environmental impact. This paper proposes a simple yet effective KGE framework which can reduce training time and carbon footprint by orders of magnitude compared with state-of-the-art approaches, while producing competitive performance. We highlight three technical innovations: full batch learning via relational matrices, closed-form Orthogonal Procrustes Analysis for KGEs, and non-negative-sampling training. In addition, as the first KGE method whose entity embeddings also store full relation information, our trained models encode rich semantics and are highly interpretable. Comprehensive experiments and ablation studies involving 13 strong baselines and two standard datasets verify the effectiveness and efficiency of our algorithm.
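The closed-form Orthogonal Procrustes Analysis the abstract refers to is the classical SVD-based solution: given matrices A and B, the orthogonal matrix R minimizing ||AR − B||_F is R = UVᵀ, where UΣVᵀ is the SVD of AᵀB. The sketch below illustrates only this standard closed form, not the authors' full KGE training pipeline; the function name and shapes are illustrative assumptions.

```python
import numpy as np

def orthogonal_procrustes(A, B):
    """Closed-form solution to min_R ||A R - B||_F subject to R^T R = I.

    Illustrative sketch of the classical result used by the paper:
    if U S V^T = SVD(A^T B), then the optimal rotation is R = U V^T.
    """
    U, _, Vt = np.linalg.svd(A.T @ B)
    return U @ Vt

# Example: recover a known orthogonal map between two embedding spaces.
rng = np.random.default_rng(0)
A = rng.normal(size=(50, 8))              # e.g. 50 entity embeddings, dim 8
Q, _ = np.linalg.qr(rng.normal(size=(8, 8)))  # a ground-truth orthogonal map
B = A @ Q                                  # target embeddings
R = orthogonal_procrustes(A, B)            # R recovers Q (A has full rank)
```

Because the optimum has a closed form, no gradient-based optimization is needed for this step, which is one source of the training-time savings the abstract claims.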

Citation (APA)

Peng, X., Chen, G., Lin, C., & Stevenson, M. (2021). Highly Efficient Knowledge Graph Embedding Learning with Orthogonal Procrustes Analysis. In NAACL-HLT 2021 - 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Proceedings of the Conference (pp. 2364–2375). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.naacl-main.187
