Towards Minimal Supervision BERT-Based Grammar Error Correction

ArXiv: 2001.03521
5 citations · 29 Mendeley readers

Abstract

Current grammatical error correction (GEC) models typically treat the task as sequence generation, which requires large amounts of annotated data and limits their applicability in data-limited settings. We incorporate contextual information from a pre-trained language model to make better use of available annotation and to benefit multilingual scenarios. Results show the strong potential of Bidirectional Encoder Representations from Transformers (BERT) for the grammatical error correction task.

Citation (APA)

Li, Y., Anastasopoulos, A., & Black, A. W. (2020). Towards Minimal Supervision BERT-Based Grammar Error Correction. In AAAI 2020 - 34th AAAI Conference on Artificial Intelligence (pp. 13859–13860). AAAI Press.
