Generative Transformer with Knowledge-Guided Decoding for Academic Knowledge Graph Completion

Abstract

Academic knowledge graphs are essential resources that benefit a wide range of real-world applications. However, most existing academic knowledge graphs are far from complete; knowledge graph completion, the task of extending a knowledge graph with missing entities and relations, has therefore attracted substantial research interest. Most existing methods represent entities and relations with low-dimensional embeddings and follow a discriminative paradigm for link prediction. However, discriminative approaches can suffer from scaling issues during inference on large-scale academic knowledge graphs. In this paper, we propose a novel generative transformer with knowledge-guided decoding for academic knowledge graph completion. Specifically, we introduce generative pre-training of a transformer on the academic knowledge graph, and we propose knowledge-guided decoding, which leverages relevant knowledge from the training corpus to guide generation. We conducted experiments on benchmark datasets for knowledge graph completion. The results show that the proposed approach achieves gains of 30 MRR points over the baselines on the academic knowledge graph AIDA.
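As a rough illustration of the generative formulation described in the abstract (not the authors' released code), knowledge graph completion can be cast as sequence generation: an incomplete triple (head, relation, ?) is verbalized as a query, and a sequence-to-sequence transformer generates the missing entity. One common way to realize knowledge-guided decoding is to constrain beam search to entity names that occur in the training corpus, for example with a token-level prefix trie plugged into Hugging Face's prefix_allowed_tokens_fn hook. The model name, query template, and entity list below are placeholders, and the exact guidance mechanism in the paper may differ.

```python
# Sketch: KG completion as constrained sequence generation.
# Assumptions: a generic seq2seq checkpoint stands in for the paper's
# pre-trained transformer; candidate entities stand in for the training KG.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_NAME = "t5-small"  # placeholder, not the paper's model
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

# Hypothetical candidate tail entities drawn from the training corpus.
candidate_entities = [
    "natural language processing",
    "knowledge graph embedding",
    "link prediction",
]

class Trie:
    """Token-level prefix trie over known entity names."""
    def __init__(self):
        self.children = {}

    def add(self, token_ids):
        node = self.children
        for tok in token_ids:
            node = node.setdefault(tok, {})

    def allowed(self, prefix):
        node = self.children
        for tok in prefix:
            if tok not in node:
                return []
            node = node[tok]
        return list(node.keys())

trie = Trie()
for name in candidate_entities:
    ids = tokenizer(name, add_special_tokens=False).input_ids
    trie.add(ids + [tokenizer.eos_token_id])

def prefix_allowed_tokens_fn(batch_id, input_ids):
    # Drop the decoder start token, then look up allowed continuations.
    prefix = input_ids.tolist()[1:]
    allowed = trie.allowed(prefix)
    return allowed if allowed else [tokenizer.eos_token_id]

# Verbalized query for an incomplete triple (head, relation, ?); the
# template is illustrative only.
query = "complete: BERT | field of study | ?"
inputs = tokenizer(query, return_tensors="pt")
outputs = model.generate(
    **inputs,
    num_beams=3,
    max_new_tokens=16,
    prefix_allowed_tokens_fn=prefix_allowed_tokens_fn,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Constraining decoding this way guarantees that every generated answer is a valid entity from the corpus, which also avoids scoring all entities exhaustively, the scaling issue the abstract attributes to discriminative approaches.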

Citation (APA)

Liu, X., Mao, S., Wang, X., & Bu, J. (2023). Generative Transformer with Knowledge-Guided Decoding for Academic Knowledge Graph Completion. Mathematics, 11(5). https://doi.org/10.3390/math11051073
