Grammatical Error Correction with Dependency Distance

Abstract

The Grammatical Error Correction (GEC) task is often treated as a low-resource machine translation task that translates an ungrammatical sentence into a grammatical one. The state-of-the-art approach to GEC, the Transformer-based neural machine translation model, takes the input sentence as a token sequence without any information about the sentence's structure, and may therefore be misled by strange ungrammatical contexts. In response, to focus attention on a given token's correct collocations rather than on misleading tokens, we propose dependent self-attention, which relatively increases the attention scores between correct collocations according to the dependency distance between tokens. However, because the source sentence in the GEC task is ungrammatical, correct collocations can hardly be extracted by a standard dependency parser. We therefore propose a dependency parser for ungrammatical sentences to obtain the dependency distance between tokens in the ungrammatical sentence. Our method achieves competitive results on the BEA-2019 shared task, the CoNLL-2014 shared task, and the JFLEG test sets.
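
The abstract does not give the exact formulation of dependent self-attention, so the sketch below is only an illustration under assumptions: dependency distance is taken as the tree distance between tokens in a dependency parse, and the scaled dot-product scores are biased with a simple linear penalty alpha * distance before the softmax (both the penalty form and the alpha value are hypothetical, not the paper's formula). It shows how syntactically close tokens can receive relatively higher attention than distant, potentially misleading ones.

```python
# Minimal sketch: self-attention biased by dependency distance (assumptions noted above).
import math
import torch
import torch.nn.functional as F


def dependency_distance_matrix(heads):
    """Pairwise tree distance between tokens, given each token's head index
    (-1 for the root). Computed by walking from every token up to the root."""
    n = len(heads)
    # For each token, record its ancestors (including itself) with their depth from the token.
    paths = []
    for i in range(n):
        path, node, depth = {}, i, 0
        while node != -1:
            path[node] = depth
            node, depth = heads[node], depth + 1
        paths.append(path)
    dist = torch.zeros(n, n)
    for i in range(n):
        for j in range(n):
            # Tree distance = depth(i -> lca) + depth(j -> lca), taken over the lowest common ancestor.
            common = [a for a in paths[i] if a in paths[j]]
            lca = min(common, key=lambda a: paths[i][a] + paths[j][a])
            dist[i, j] = paths[i][lca] + paths[j][lca]
    return dist


def dependent_self_attention(q, k, v, dep_dist, alpha=0.5):
    """Scaled dot-product attention with a dependency-distance bias: pairs that
    are close in the parse tree get relatively higher scores than distant ones."""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)
    scores = scores - alpha * dep_dist  # hypothetical linear penalty on distant tokens
    return F.softmax(scores, dim=-1) @ v


if __name__ == "__main__":
    # Toy example: 5 tokens with heads from a (hypothetical) dependency parse; token 1 is the root.
    heads = [1, -1, 1, 4, 1]
    dist = dependency_distance_matrix(heads)
    q = k = v = torch.randn(5, 8)
    out = dependent_self_attention(q, k, v, dist)
    print(out.shape)  # torch.Size([5, 8])
```

In this sketch the bias is shared across heads and added before the softmax; the paper's actual mechanism for combining dependency distance with attention, and its parser for ungrammatical sentences, may differ from this simplification.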

Cite (APA)

Lin, H., Li, J., Zhang, X., & Chen, H. (2021). Grammatical Error Correction with Dependency Distance. In International Conference on Information and Knowledge Management, Proceedings (pp. 1018–1027). Association for Computing Machinery. https://doi.org/10.1145/3459637.3482348
