Controlled Language Generation for Language Learning Items

3 citations · 23 Mendeley readers

Abstract

This work aims to employ natural language generation (NLG) to rapidly generate items for English language learning applications. This requires both language models capable of generating fluent, high-quality English and control over the generated output so that it matches the requirements of the relevant items. We experiment with deep pretrained models for this task, developing novel methods for controlling items along factors relevant to language learning: diverse sentences for different proficiency levels, and argument structure to test grammar. Human evaluation demonstrates high grammaticality scores for all models (3.4 and above out of 4), and 24% greater length and 9% greater complexity over the baseline for the advanced-proficiency model. Our results show that we can achieve strong performance while adding control to ensure diverse, tailored content for individual users.
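
As a rough illustration of the kind of controlled generation the abstract describes, the sketch below conditions a pretrained sequence-to-sequence model on a proficiency control token prepended to the input. This is a minimal sketch only: the control scheme, prompt format, and checkpoint ("t5-base", the "<advanced>" tag, and the prompt text) are assumptions for illustration, not the paper's actual setup.

    # Minimal sketch: control-token conditioned generation with a pretrained
    # seq2seq model. The checkpoint, control tag, and prompt format below are
    # hypothetical placeholders, not the configuration used in the paper.
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    model_name = "t5-base"  # assumed checkpoint; the paper's model is not named on this page
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

    # Prepend a control token indicating the target proficiency level,
    # so a fine-tuned model can steer length and complexity accordingly.
    prompt = "<advanced> generate sentence: travel"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=40, num_beams=4)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

In practice such a model would be fine-tuned on data where each sentence is paired with its proficiency tag, so that the tag reliably shifts the style of the generated item; the snippet only shows the inference-time interface.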

Cite (APA)

Stowe, K., Ghosh, D., & Zhao, M. (2022). Controlled Language Generation for Language Learning Items. In EMNLP 2022 - Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing: Industry Track (pp. 304–315). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.emnlp-industry.30
