A nested attention neural hybrid model for grammatical error correction


Abstract

Grammatical error correction (GEC) systems strive to correct both global errors in word order and usage, and local errors in spelling and inflection. Building on recent work in neural machine translation, we propose a new hybrid neural model with nested attention layers for GEC. Experiments show that the new model can effectively correct errors of both types by incorporating word- and character-level information, and that it significantly outperforms previous neural models for GEC on the standard CoNLL-14 benchmark. Further analysis shows that the model's superiority can be largely attributed to the nested attention mechanism, which proves particularly effective at correcting local errors involving small orthographic edits.
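The core idea of the abstract — a word-level attention that "nests" a character-level attention so the model can handle small orthographic edits — can be illustrated with a minimal sketch. This is an assumption-laden toy, not the authors' architecture (which uses an encoder-decoder NMT model); all names, sizes, and the fallback-on-OOV rule below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, keys, values):
    """Dot-product attention: distribution over keys, weighted sum of values."""
    weights = softmax(keys @ query)
    return weights @ values, weights

d = 8  # shared embedding size (assumed)
vocab = {"the": rng.normal(size=d), "cat": rng.normal(size=d)}
char_emb = {c: rng.normal(size=d) for c in "abcdefghijklmnopqrstuvwxyz"}

def word_repr(word, query):
    """Word embedding if in-vocabulary; otherwise the inner (nested)
    character-level attention over the word's letters."""
    if word in vocab:
        return vocab[word]
    chars = np.stack([char_emb[c] for c in word])
    rep, _ = attend(query, chars, chars)
    return rep

# Outer attention: a decoder query attends over per-word representations.
# The misspelled "catt" is out-of-vocabulary, so its representation comes
# from the inner character-level attention.
query = rng.normal(size=d)
source = ["the", "catt"]
reprs = np.stack([word_repr(w, query) for w in source])
context, outer_weights = attend(query, reprs, reprs)
```

The point of the nesting is that local errors like "catt" still yield a usable representation: the outer word-level attention sees a character-composed vector rather than a single unknown-word token.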

Citation (APA)

Ji, J., Wang, Q., Toutanova, K., Gong, Y., Truong, S., & Gao, J. (2017). A nested attention neural hybrid model for grammatical error correction. In ACL 2017 - 55th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers) (Vol. 1, pp. 753–762). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/P17-1070
