Abstract
This paper describes our contribution to the low-resource track of the BEA 2019 shared task on Grammatical Error Correction (GEC). Our approach builds on the noisy channel model, combining a channel model with a language model. We generate confusion sets from the Wikipedia edit history and use the edit frequencies to estimate the channel model. We then use two pretrained language models: 1) Google's BERT, which we fine-tune for specific error types, and 2) OpenAI's GPT-2, exploiting its ability to condition on preceding sentences as context. Finally, we search for the optimal combination of corrections using beam search.
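The noisy-channel setup described above can be sketched in a few lines: each observed word has a confusion set of candidate corrections, a channel score derived from edit counts is combined with a language-model score, and beam search keeps the best partial hypotheses. The confusion sets, edit counts, and unigram scores below are all made-up toy data, and the unigram scorer is a stand-in for the BERT/GPT-2 models the paper actually uses.

```python
import math

# Toy confusion sets with edit counts (all data invented for illustration;
# the paper derives confusion sets and counts from Wikipedia edit histories).
CONFUSIONS = {
    "their": {"their": 60, "there": 40},
    "is": {"is": 95, "are": 5},
}
# Stand-in unigram language model (the paper uses BERT and GPT-2 instead).
UNIGRAM_LM = {"their": -4.0, "there": -2.0, "is": -1.5, "are": -1.8, "book": -4.0}

def channel_logprob(observed, candidate):
    """Channel score for a candidate correction, from relative edit
    frequencies of the observed word (a simplified stand-in for the
    paper's channel model)."""
    counts = CONFUSIONS.get(observed, {observed: 1})
    total = sum(counts.values())
    return math.log(counts.get(candidate, 1) / total)

def lm_logprob(tokens):
    """Stand-in unigram LM score; unknown words get a flat penalty."""
    return sum(UNIGRAM_LM.get(t, -8.0) for t in tokens)

def beam_correct(sentence, beam_size=2):
    """Beam search over per-word confusion sets, ranking partial
    hypotheses by channel score + LM score."""
    beams = [([], 0.0)]  # (tokens so far, accumulated channel score)
    for word in sentence.split():
        candidates = CONFUSIONS.get(word, {word: 1})
        expanded = []
        for tokens, score in beams:
            for cand in candidates:
                expanded.append((tokens + [cand],
                                 score + channel_logprob(word, cand)))
        # Keep the top hypotheses under the combined score.
        expanded.sort(key=lambda b: b[1] + lm_logprob(b[0]), reverse=True)
        beams = expanded[:beam_size]
    best = max(beams, key=lambda b: b[1] + lm_logprob(b[0]))
    return " ".join(best[0])

print(beam_correct("their is a book"))
```

With the toy numbers above, the LM's preference for "there is" outweighs the channel's preference for keeping "their", so the sketch corrects "their is a book" to "there is a book". The real system scores candidates with fine-tuned BERT and context-aware GPT-2 rather than unigram frequencies.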
Flachs, S., Lacroix, O., & Søgaard, A. (2019). Noisy channel for low resource grammatical error correction. In ACL 2019 - Innovative Use of NLP for Building Educational Applications, BEA 2019 - Proceedings of the 14th Workshop (pp. 191–196). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/w19-4420