A Pretraining Numerical Reasoning Model for Ordinal Constrained Question Answering on Knowledge Base

7 Citations (citations of this article)
53 Readers (Mendeley users who have this article in their library)

Abstract

Knowledge Base Question Answering (KBQA) aims to answer natural language questions posed over knowledge bases (KBs). This paper aims to empower IR-based KBQA models with numerical reasoning ability for answering ordinal-constrained questions. A major challenge is the lack of explicit annotations about numerical properties. To address this challenge, we propose a pretraining numerical reasoning model consisting of NumGNN and NumTransformer, guided by explicit self-supervision signals. The two modules are pretrained to encode the magnitude and ordinal properties of numbers, respectively, and can serve as model-agnostic plugins for any IR-based KBQA model to enhance its numerical reasoning ability. Extensive experiments on two KBQA benchmarks verify the effectiveness of our method in enhancing the numerical reasoning ability of IR-based KBQA models. Our code and datasets are available online.
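
To make the idea of self-supervised ordinal pretraining concrete, below is a minimal illustrative sketch, not the authors' released code or architecture. It assumes a PyTorch setup, and the names NumberEncoder and ordinal_pretraining_step are hypothetical. It shows one possible self-supervision signal of the kind the abstract describes: pairs of numbers are sampled and a number encoder is trained with a margin ranking loss so that its embeddings respect the ordering (ordinal property) of the underlying values.

```python
# Hypothetical sketch only: a toy number encoder pretrained with a
# self-supervised pairwise ranking objective so that embeddings preserve
# the ordinal relation between numeric values. This is an assumption for
# illustration, not the NumGNN/NumTransformer modules from the paper.
import torch
import torch.nn as nn

class NumberEncoder(nn.Module):
    """Maps a scalar numeric value to a dense embedding."""
    def __init__(self, dim: int = 64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(1, dim), nn.ReLU(), nn.Linear(dim, dim)
        )
        # Scalar score head used only by the ordering objective.
        self.score = nn.Linear(dim, 1)

    def forward(self, values: torch.Tensor) -> torch.Tensor:
        return self.mlp(values.unsqueeze(-1))

def ordinal_pretraining_step(encoder, values, optimizer, margin=1.0):
    """One self-supervised step: pair each number with a random other
    number from the batch and push the larger one to a higher score."""
    perm = torch.randperm(values.size(0))
    a, b = values, values[perm]
    score_a = encoder.score(encoder(a)).squeeze(-1)
    score_b = encoder.score(encoder(b)).squeeze(-1)
    # target = +1 where a > b, -1 where a < b; ties are masked out.
    target = torch.sign(a - b)
    mask = target != 0
    loss = nn.functional.margin_ranking_loss(
        score_a[mask], score_b[mask], target[mask], margin=margin
    )
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    torch.manual_seed(0)
    encoder = NumberEncoder()
    optim = torch.optim.Adam(encoder.parameters(), lr=1e-3)
    for step in range(200):
        batch = torch.rand(128) * 1000.0  # synthetic numeric values
        loss = ordinal_pretraining_step(encoder, batch, optim)
    print(f"final ranking loss: {loss:.4f}")
```

In a plugin-style setup of the kind the abstract describes, an encoder pretrained this way could be attached to an existing IR-based KBQA model so that comparisons needed for ordinal constraints (e.g., "largest", "second tallest") operate on embeddings that already encode magnitude and order.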

Cite

CITATION STYLE

APA

Feng, Y., Zhang, J., He, G., Zhao, W. X., Liu, L., Liu, Q., … Chen, H. (2021). A Pretraining Numerical Reasoning Model for Ordinal Constrained Question Answering on Knowledge Base. In Findings of the Association for Computational Linguistics, Findings of ACL: EMNLP 2021 (pp. 1852–1861). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.findings-emnlp.159
