Simple and Effective Relation-based Embedding Propagation for Knowledge Representation Learning

Abstract

Relational graph neural networks have attracted particular attention for encoding graph context in knowledge graphs (KGs). Although they achieve competitive performance on small KGs, how to efficiently and effectively exploit graph context for large KGs remains an open problem. To this end, we propose the Relation-based Embedding Propagation (REP) method, a post-processing technique that adapts pre-trained KG embeddings with graph context. Since relations in KGs are directional, we model the incoming head context and the outgoing tail context separately, and we design relational context functions that require no external parameters. In addition, we aggregate context information by averaging, which makes REP more computation-efficient. We theoretically prove that these designs avoid information distortion during propagation. Extensive experiments further demonstrate that REP scales well while improving or maintaining prediction quality. Notably, it brings an average relative improvement of about 10% to triplet-based embedding methods on OGBL-WikiKG2 and needs only 5%-83% of the time to achieve results comparable to the state-of-the-art GC-OTE.
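The idea in the abstract — propagating directional, relation-aware context into pre-trained embeddings by averaging, with no extra learned parameters — can be illustrated with a minimal sketch. The details below are assumptions for illustration, not the paper's exact formulation: it assumes TransE-style embeddings, uses `h + r` as the incoming head context pushed to the tail and `t - r` as the outgoing tail context pushed to the head, and mixes the averaged context into each embedding with a hypothetical interpolation weight `alpha`.

```python
def rep_update(entity_emb, rel_emb, triples, alpha=0.5):
    """One pass of relation-based context propagation (illustrative sketch).

    entity_emb / rel_emb: lists of float vectors (pre-trained embeddings).
    triples: list of (head, relation, tail) index tuples.
    alpha: interpolation weight -- an assumption, not from the paper.
    """
    dim = len(entity_emb[0])
    ctx = [[0.0] * dim for _ in entity_emb]   # accumulated context per entity
    cnt = [0] * len(entity_emb)               # number of contributing triples
    for h, r, t in triples:
        for i in range(dim):
            # Assumed TransE-style context functions: the incoming head
            # context h + r flows to the tail; the outgoing tail context
            # t - r flows back to the head. Directions are kept separate.
            ctx[t][i] += entity_emb[h][i] + rel_emb[r][i]
            ctx[h][i] += entity_emb[t][i] - rel_emb[r][i]
        cnt[t] += 1
        cnt[h] += 1
    out = []
    for e, (emb, c) in enumerate(zip(entity_emb, cnt)):
        if c == 0:
            out.append(list(emb))              # isolated entity: unchanged
        else:
            avg = [v / c for v in ctx[e]]      # average, not sum, of context
            out.append([(1 - alpha) * a + alpha * b
                        for a, b in zip(emb, avg)])
    return out
```

Because the pass is a single sweep over the triples with mean aggregation and no trainable weights, it can be run as post-processing after any embedding model has been trained, which is what makes the approach cheap on large KGs.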

Citation (APA)

Wang, H., Dai, S., Su, W., Zhong, H., Fang, Z., Huang, Z., … Yu, D. (2022). Simple and Effective Relation-based Embedding Propagation for Knowledge Representation Learning. In IJCAI International Joint Conference on Artificial Intelligence (pp. 2755–2761). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2022/382
