Friendly Neighbors: Contextualized Sequence-to-Sequence Link Prediction

Abstract

We propose KGT5-context, a simple sequence-to-sequence model for link prediction (LP) in knowledge graphs (KGs). Our work expands on KGT5, a recent LP model that exploits textual features of the KG, has a small model size, and is scalable. To reach good predictive performance, however, KGT5 relies on an ensemble with a knowledge graph embedding (KGE) model, which itself is excessively large and costly to use. In this short paper, we show empirically that adding contextual information—i.e., information about the direct neighborhood of the query entity—alleviates the need for a separate KGE model to obtain good performance. The resulting KGT5-context model is simple, reduces model size significantly, and obtains state-of-the-art performance in our experimental study.
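The core idea—verbalizing a link-prediction query together with the query entity's direct neighborhood as a single textual input—can be illustrated with a minimal sketch. The function below is hypothetical: the abstract does not specify the exact input format KGT5-context uses, so the separators and field labels here are illustrative assumptions.

```python
# Hypothetical sketch of verbalizing an LP query with neighborhood context,
# in the spirit of KGT5-context. The exact serialization format used by the
# authors is not given in this abstract; labels/separators are assumptions.

def verbalize_query(entity: str, relation: str, neighbors: list[tuple[str, str]]) -> str:
    """Build a seq2seq input string: the (entity, relation, ?) query
    followed by the entity's direct neighbors as textual context."""
    context = " | ".join(f"{rel}: {obj}" for rel, obj in neighbors)
    return f"query: {entity} {relation} | context: {context}"

# Example: predict the missing tail of (Barack Obama, born in, ?),
# with two neighboring facts of the query entity as context.
text = verbalize_query(
    "Barack Obama",
    "born in",
    neighbors=[("occupation", "politician"), ("spouse", "Michelle Obama")],
)
```

A sequence-to-sequence model trained on such inputs then decodes the missing entity directly as text, so no separate embedding table over all entities is required.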

Cite (APA)

Kochsiek, A., Saxena, A., Nair, I., & Gemulla, R. (2023). Friendly Neighbors: Contextualized Sequence-to-Sequence Link Prediction. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 131–138). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.repl4nlp-1.11
