Abstract
Current grammatical error correction (GEC) models typically treat the task as sequence generation, which requires large amounts of annotated data and limits their applicability in data-limited settings. We try to incorporate contextual information from a pre-trained language model to better leverage annotated data and to benefit multilingual scenarios. Results show the strong potential of Bidirectional Encoder Representations from Transformers (BERT) for the grammatical error correction task.
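To make the core idea concrete, the following is a minimal sketch (not the paper's actual method) of how a pre-trained BERT masked language model can propose a token-level correction: a suspect token is masked and BERT's contextual distribution ranks replacement candidates. It assumes the Hugging Face transformers library, the bert-base-cased checkpoint, and that some upstream detector has already flagged the erroneous token.

import torch
from transformers import BertForMaskedLM, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-cased")
model = BertForMaskedLM.from_pretrained("bert-base-cased")
model.eval()

# Suppose an upstream error detector has flagged "go" as a likely error.
sentence = "He go to school every day."
masked = sentence.replace("go", tokenizer.mask_token, 1)

inputs = tokenizer(masked, return_tensors="pt")
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]

with torch.no_grad():
    logits = model(**inputs).logits

# Rank candidate replacements by BERT's contextual distribution.
top_ids = logits[0, mask_pos[0]].topk(5).indices
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))
# "goes" should rank highly, yielding the correction.

Because the candidates come from the pre-trained model rather than task-specific training, this style of correction needs little or no annotated GEC data, which is the minimal-supervision setting the abstract describes.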
Citation
Li, Y., Anastasopoulos, A., & Black, A. W. (2020). Towards Minimal Supervision BERT-Based Grammar Error Correction. In AAAI 2020 - 34th AAAI Conference on Artificial Intelligence (pp. 13859–13860). AAAI press.