Case-Based Reasoning for Natural Language Queries over Knowledge Bases

90 citations · 133 Mendeley readers

Abstract

It is often challenging to solve a complex problem from scratch, but much easier if we can access other similar problems with their solutions — a paradigm known as case-based reasoning (CBR). We propose a neuro-symbolic CBR approach (CBR-KBQA) for question answering over large knowledge bases. CBR-KBQA consists of a nonparametric memory that stores cases (question–logical form pairs) and a parametric model that generates a logical form for a new question by retrieving cases relevant to it. On several KBQA datasets that contain complex questions, CBR-KBQA achieves competitive performance. For example, on the COMPLEXWEBQUESTIONS dataset, CBR-KBQA outperforms the current state of the art by 11% in accuracy. Furthermore, we show that CBR-KBQA can use new cases without any further training: by incorporating a few human-labeled examples into the case memory, CBR-KBQA successfully generates logical forms containing unseen KB entities and relations.
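The retrieve-then-generate pipeline described above can be illustrated with a minimal sketch. This is not the paper's implementation: CBR-KBQA uses a learned dense encoder for retrieval and a seq2seq model for generation, whereas the toy below substitutes a bag-of-words cosine similarity and a hypothetical hand-built case memory, showing only the nonparametric retrieval step and how new cases can be added without retraining.

```python
from collections import Counter
import math

# Hypothetical case memory of (question, logical form) pairs.
# In CBR-KBQA this is a large nonparametric store over the KBQA training set.
CASE_MEMORY = [
    ("who directed Titanic", "(JOIN film.directed_by Titanic)"),
    ("who directed Avatar", "(JOIN film.directed_by Avatar)"),
    ("where was Obama born", "(JOIN person.place_of_birth Obama)"),
]

def embed(text):
    """Toy bag-of-words 'embedding'; the paper uses a learned dense encoder."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, k=2):
    """Return the k cases whose questions are most similar to the query."""
    q = embed(query)
    return sorted(CASE_MEMORY,
                  key=lambda case: cosine(q, embed(case[0])),
                  reverse=True)[:k]

# The retrieved (question, logical form) cases would then be concatenated
# with the new question and fed to a parametric seq2seq generator, which
# adapts their logical forms to the new question's entities and relations.
cases = retrieve("who directed Inception")
```

Because the memory is nonparametric, handling an unseen relation only requires appending a new labeled case to `CASE_MEMORY` — no gradient update is needed, which is the property the abstract highlights.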

Citation (APA)

Das, R., Zaheer, M., Thai, D., Godbole, A., Perez, E., Lee, J. Y., … McCallum, A. (2021). Case-Based Reasoning for Natural Language Queries over Knowledge Bases. In EMNLP 2021 - 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings (pp. 9594–9611). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.emnlp-main.755
