Distractor Generation based on Text2Text Language Models with Pseudo Kullback-Leibler Divergence Regulation

8 Citations · 13 Readers (Mendeley)

Abstract

In this paper, we address the task of cloze-style multiple-choice question (MCQ) distractor generation. Our study features the following designs. First, we formulate cloze distractor generation as a Text2Text task. Second, we propose a pseudo Kullback-Leibler divergence to regulate generation with respect to the item discrimination index used in educational evaluation. Third, we explore a candidate-augmentation strategy and multi-task training with cloze-related tasks to further boost generation performance. Through experiments on benchmark datasets, our best-performing model advances the state-of-the-art result from 10.81 to 22.00 (P@1 score).
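The abstract describes the approach only at a high level. As a rough illustration, the sketch below shows what the Text2Text framing with a KL-style regularizer could look like in PyTorch/Hugging Face. The prompt format, model choice ("t5-base"), hinge-style loss, and hyperparameters are all assumptions for illustration; this is not the authors' actual pseudo-KL formulation or their discrimination-index regulation.

```python
# Illustrative sketch only: Text2Text distractor generation with a
# KL-style regularizer. All names and the loss shape are assumptions,
# not the paper's exact method.
import torch
import torch.nn.functional as F
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

# Text2Text framing: serialize the cloze stem and the correct answer
# into the source string; the target is a single distractor.
source = "generate distractor: The capital of France is [MASK]. answer: Paris"
target = "London"

inputs = tokenizer(source, return_tensors="pt")
labels = tokenizer(target, return_tensors="pt").input_ids

outputs = model(**inputs, labels=labels)
ce_loss = outputs.loss  # standard seq2seq cross-entropy on the distractor

# KL-style regulation (illustrative stand-in): push the distractor's
# first-step output distribution away from the gold answer's, so the
# generated option does not collapse onto the correct answer.
answer_ids = tokenizer("Paris", return_tensors="pt").input_ids
with torch.no_grad():
    answer_logits = model(**inputs, labels=answer_ids).logits

log_p_distractor = F.log_softmax(outputs.logits[:, 0, :], dim=-1)
p_answer = F.softmax(answer_logits[:, 0, :], dim=-1)
kl = F.kl_div(log_p_distractor, p_answer, reduction="batchmean")

margin, lam = 1.0, 0.1  # assumed hyperparameters
loss = ce_loss + lam * torch.clamp(margin - kl, min=0.0)  # hinge on the KL
loss.backward()
```

A hinge on the KL term (rather than subtracting it outright) keeps the combined loss bounded while still penalizing distractor distributions that sit too close to the answer's; this is one plausible design choice, not necessarily the paper's.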

Cite

Citation style: APA

Wang, H. J., Hsieh, K. Y., Yu, H. C., Tsou, J. C., Shih, Y. A., Huang, C. H., & Fan, Y. C. (2023). Distractor generation based on Text2Text language models with pseudo Kullback-Leibler divergence regulation. In Findings of the Association for Computational Linguistics: ACL 2023 (pp. 12477–12491). Association for Computational Linguistics. https://doi.org/10.18653/v1/2023.findings-acl.790
