Numerical reasoning skills are essential for complex question answering (CQA) over text. Such reasoning requires operations including counting, comparison, addition, and subtraction. A successful approach to CQA on text, Neural Module Networks (NMNs), follows the programmer-interpreter paradigm and leverages specialised modules to perform compositional reasoning. However, the NMNs framework does not consider the relationship between numbers and entities in either questions or paragraphs. We propose effective techniques to improve NMNs' numerical reasoning capabilities by making the interpreter question-aware and capturing the relationship between entities and numbers. On the same subset of the DROP dataset for CQA on text, experimental results show that our additions outperform the original NMNs by 3.0 points in overall F1 score.
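The programmer-interpreter paradigm the abstract refers to can be made concrete with a small sketch. The following Python snippet is a hypothetical illustration, not the authors' implementation: a "program" is a sequence of module calls over numbers extracted from the paragraph, and an interpreter executes the numerical modules the abstract names (counting, comparison, addition, subtraction). All module names, the program format, and the example question are illustrative assumptions.

```python
# Hypothetical sketch of the NMN programmer-interpreter idea: a "programmer"
# maps a question to a program (a sequence of module calls), and an
# "interpreter" executes the modules over values from the paragraph.
# Module names and program layout here are illustrative only.

from typing import Callable, Dict, List, Tuple

# Toy numerical modules covering the operations listed in the abstract.
MODULES: Dict[str, Callable[..., float]] = {
    "count":    lambda values: float(len(values)),
    "compare":  lambda a, b: max(a, b),   # e.g. "which is larger?"
    "add":      lambda a, b: a + b,
    "subtract": lambda a, b: a - b,
}

def interpret(program: List[Tuple], env: Dict[str, object]) -> float:
    """Execute a program: each step is (module_name, *argument_keys).

    Argument keys name entries in `env` (numbers/entities extracted from
    the paragraph); each step's result is stored back into `env` so later
    steps can reuse it, enabling compositional reasoning.
    """
    result = 0.0
    for i, (name, *arg_keys) in enumerate(program):
        args = [env[k] for k in arg_keys]
        result = MODULES[name](*args)
        env[f"out{i}"] = result
    return result

# Example: "How many more yards was the longest field goal than the shortest?"
env = {"longest": 54.0, "shortest": 23.0}
program = [("subtract", "longest", "shortest")]
print(interpret(program, env))  # 31.0
```

In the actual NMNs framework the modules are differentiable and operate over learned representations rather than raw floats; the sketch only conveys the control flow of program execution.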
CITATION STYLE
Guo, X. Y., Li, Y. F., & Haffari, G. (2021). Improving Numerical Reasoning Skills in the Modular Approach for Complex Question Answering on Text. In Findings of the Association for Computational Linguistics: EMNLP 2021 (pp. 2713–2718). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.findings-emnlp.231