Clozer: Adaptable Data Augmentation for Cloze-style Reading Comprehension

Abstract

Task-adaptive pre-training (TAPT) alleviates the lack of labelled data and provides a performance lift by adapting unlabelled data to the downstream task. Unfortunately, existing adaptations mainly involve deterministic rules that cannot generalize well. Here, we propose Clozer, a sequence-tagging based cloze answer extraction method used in TAPT that is extendable to any cloze-style machine reading comprehension (MRC) downstream task. We experiment on multiple-choice cloze-style MRC tasks and show that Clozer performs significantly better than the oracle and the state of the art at increasing the effectiveness of TAPT in lifting model performance, and we show that Clozer is able to recognize the gold answers independently of any heuristics.
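
The sequence-tagging idea can be sketched in a few lines: a tagger proposes candidate answer spans in an unlabelled passage, and each tagged span is masked to produce a pseudo cloze question/answer pair that can be used for TAPT. The snippet below is a minimal, illustrative sketch only, not the authors' implementation: the helper names (tag_candidate_answers, make_cloze_example) are hypothetical, and the stub tagger simply stands in for a trained token-classification model.

```python
# Illustrative sketch of sequence-tagging based cloze answer extraction.
# All names here are hypothetical; the stub tagger replaces Clozer's trained model.
from typing import List, Tuple

MASK = "[MASK]"

def tag_candidate_answers(tokens: List[str]) -> List[Tuple[int, int]]:
    """Stand-in for a trained sequence tagger: returns (start, end) token spans
    that could serve as cloze answers. Here we naively pick capitalized,
    non-sentence-initial tokens purely for demonstration."""
    spans = []
    for i, tok in enumerate(tokens):
        if i > 0 and tok[:1].isupper():
            spans.append((i, i + 1))
    return spans

def make_cloze_example(tokens: List[str], span: Tuple[int, int]) -> Tuple[str, str]:
    """Mask the tagged span to create a pseudo cloze question/answer pair
    from unlabelled text, usable as task-adaptive pre-training data."""
    start, end = span
    answer = " ".join(tokens[start:end])
    question = " ".join(tokens[:start] + [MASK] + tokens[end:])
    return question, answer

if __name__ == "__main__":
    passage = "The treaty was signed in Paris after months of negotiation".split()
    for span in tag_candidate_answers(passage):
        question, answer = make_cloze_example(passage, span)
        print(f"Q: {question}\nA: {answer}")
```

In the paper's setting, the stub tagger would be replaced by a learned sequence-tagging model, so that candidate answers are recognized without hand-written heuristics.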

Citation (APA)

Lovenia, H., Wilie, B., Chung, W., Zeng, M., Cahyawijaya, S., Dan, S., & Fung, P. (2022). Clozer: Adaptable Data Augmentation for Cloze-style Reading Comprehension. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 60–66). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.repl4nlp-1.7
