Reasoning about tabular information presents unique challenges to modern NLP approaches, which largely rely on pre-trained contextualized embeddings of text. In this paper, we study these challenges through the problem of tabular natural language inference. We propose easy and effective modifications to how information is presented to a model for this task. We show via systematic experiments that these strategies substantially improve tabular inference performance.
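Tabular NLI takes a table as the premise and a sentence as the hypothesis, so the table must first be rendered as text before a pre-trained sentence-pair model can consume it. The sketch below illustrates one common linearization scheme; the function name, the sentence template, and the example table are all illustrative assumptions, not the paper's exact method.

```python
# Hypothetical sketch of table linearization for tabular NLI.
# The template "The <key> of <title> is <value>." is an assumption
# for illustration, not the paper's specific input representation.

def linearize_table(title, rows):
    """Turn a list of {key: value} rows into a textual premise."""
    sentences = []
    for row in rows:
        for key, value in row.items():
            sentences.append(f"The {key} of {title} is {value}.")
    return " ".join(sentences)

# Toy example table (invented for illustration).
table = {"title": "Taj Mahal",
         "rows": [{"location": "Agra", "completed": "1653"}]}

premise = linearize_table(table["title"], table["rows"])
hypothesis = "The Taj Mahal was finished in the seventeenth century."
# The (premise, hypothesis) pair can then be scored by any standard
# sentence-pair NLI classifier for entailment / contradiction / neutral.
```

The resulting premise string lets an off-the-shelf NLI model treat the table like ordinary text, which is the setting the paper's presentation modifications target.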
Neeraja, J., Gupta, V., & Srikumar, V. (2021). Incorporating External Knowledge to Enhance Tabular Reasoning. In NAACL-HLT 2021 - 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Proceedings of the Conference (pp. 2799–2809). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.naacl-main.224