Solving Math Word Problems with Multi-Encoders and Multi-Decoders

49 citations · 82 Mendeley readers

Abstract

Solving math word problems remains a challenging task in which latent semantics and mathematical logic must be mined from natural language. Although previous studies employ the Seq2Seq technique to transform text descriptions into equation expressions, most of them achieve inferior performance due to insufficient consideration in the design of the encoder and decoder. Specifically, these models treat input/output objects only as sequences, ignoring the important structural information contained in text descriptions and equation expressions. To overcome these defects, this paper proposes a model with multiple encoders and multiple decoders, which combines a sequence-based encoder and a graph-based encoder to enhance the representation of text descriptions, and generates different equation expressions via a sequence-based decoder and a tree-based decoder. Experimental results on the Math23K dataset show that the model outperforms existing state-of-the-art methods.
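To make the encoder side of the described architecture concrete, below is a minimal PyTorch sketch of a multi-encoder: a BiLSTM sequence encoder combined with a single GCN-style propagation step over a graph derived from the problem text (e.g., a dependency parse), fused by a linear layer. All module names, dimensions, and the fusion scheme are illustrative assumptions, not the authors' implementation, which additionally pairs such an encoder with sequence-based and tree-based decoders.

```python
# A minimal sketch of combining a sequence encoder with a graph encoder.
# Dimensions, layer choices, and the fusion scheme are assumptions for
# illustration; they are not taken from the paper.
import torch
import torch.nn as nn


class MultiEncoder(nn.Module):
    """Encodes a problem text with a sequence encoder (BiLSTM) and a
    graph encoder (one GCN-style propagation step), then fuses them."""

    def __init__(self, vocab_size: int, embed_dim: int = 128, hidden_dim: int = 256):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Sequence-based encoder: bidirectional LSTM over the token sequence.
        self.seq_encoder = nn.LSTM(
            embed_dim, hidden_dim // 2, batch_first=True, bidirectional=True
        )
        # Graph-based encoder: one linear propagation over an adjacency matrix
        # (e.g., built from a dependency parse of the problem text).
        self.graph_proj = nn.Linear(hidden_dim, hidden_dim)
        # Fusion layer combining sequence and graph representations.
        self.fuse = nn.Linear(2 * hidden_dim, hidden_dim)

    def forward(self, token_ids: torch.Tensor, adjacency: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len); adjacency: (batch, seq_len, seq_len)
        embedded = self.embedding(token_ids)
        seq_out, _ = self.seq_encoder(embedded)          # (batch, seq_len, hidden_dim)
        graph_out = torch.relu(self.graph_proj(adjacency @ seq_out))
        return torch.tanh(self.fuse(torch.cat([seq_out, graph_out], dim=-1)))


if __name__ == "__main__":
    encoder = MultiEncoder(vocab_size=1000)
    tokens = torch.randint(0, 1000, (2, 12))             # toy batch of 2 problems
    adj = torch.eye(12).unsqueeze(0).repeat(2, 1, 1)     # placeholder graphs
    print(encoder(tokens, adj).shape)                    # torch.Size([2, 12, 256])
```

The fused token representations would then be consumed by the decoders; in the paper's design, a sequence-based decoder and a tree-based decoder each generate an equation expression from this shared encoding.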

Citation (APA)

Shen, Y., & Jin, C. (2020). Solving Math Word Problems with Multi-Encoders and Multi-Decoders. In COLING 2020 - 28th International Conference on Computational Linguistics, Proceedings of the Conference (pp. 2924–2934). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.coling-main.262
