Implicit Relation Linking for Question Answering over Knowledge Graph

Abstract

Relation linking (RL) is a vital module in knowledge-based question answering (KBQA) systems. It aims to link the relations expressed in natural language (NL) to the corresponding ones in a knowledge graph (KG). Existing methods mainly rely on the textual similarity between NL and KG to build relation links. Due to the ambiguity of NL and the incompleteness of KG, many relations in NL are expressed implicitly and may not link to a single relation in KG, which challenges current methods. In this paper, we propose an implicit RL method called ImRL, which links relation phrases in NL to relation paths in KG. To find proper relation paths, we propose a novel path ranking model that aligns relation phrases in NL and relation paths in KG using not only textual information in the word embedding space but also structural information in the KG embedding space. In addition, we leverage a gated mechanism with attention to inject prior knowledge from external paraphrase dictionaries to handle relation phrases with vague meaning. Our experiments on two benchmark datasets and a newly created one show that ImRL significantly outperforms several state-of-the-art methods, especially for implicit RL.
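The core idea of the path ranking described above can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's actual learned model: it scores each candidate KG relation path by combining a textual similarity (word embedding space) with a structural similarity (KG embedding space). All vectors and the mixing weight `alpha` are made-up toy values.

```python
# Hypothetical sketch of ImRL-style path ranking: combine textual and
# structural similarity to rank candidate relation paths for a phrase.
# The real model learns these representations; here they are toy vectors.
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two vectors, 0.0 if either is zero."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = sqrt(sum(a * a for a in u))
    nv = sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def score_path(phrase_text, phrase_kg, path_text, path_kg, alpha=0.5):
    """Weighted sum of textual and structural alignment scores.

    alpha is an assumed fixed weight; the paper's combination is learned.
    """
    textual = cosine(phrase_text, path_text)       # word embedding space
    structural = cosine(phrase_kg, path_kg)        # KG embedding space
    return alpha * textual + (1 - alpha) * structural

# Toy example: rank candidate paths for the implicit phrase "grandfather of",
# which matches no single KG relation but does match a two-hop path.
phrase_t, phrase_s = [0.9, 0.1], [0.8, 0.2]
candidates = {
    "father -> father": ([0.85, 0.15], [0.75, 0.25]),
    "spouse -> mother": ([0.20, 0.80], [0.30, 0.70]),
}
ranked = sorted(candidates,
                key=lambda p: score_path(phrase_t, phrase_s, *candidates[p]),
                reverse=True)
print(ranked[0])  # prints "father -> father"
```

The toy scores favor the two-hop path whose embeddings align with the phrase in both spaces, which is the intuition behind aligning textual and structural information jointly.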

Citation (APA)
Zhao, Y., Huang, J., Hu, W., Chen, Q., Qiu, X., Huo, C., & Ren, W. (2022). Implicit Relation Linking for Question Answering over Knowledge Graph. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 3956–3968). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.findings-acl.312
