Learning to Generate Question by Asking Question: A Primal-Dual Approach with Uncommon Word Generation

Citations: 6
Readers (Mendeley): 25

Abstract

Automatic question generation (AQG) is the task of generating a question from a given passage and an answer. Most existing AQG methods aim at encoding the passage and the answer to generate the question. However, limited work has focused on modeling the correlation between the target answer and the generated question. Moreover, the generation of unseen or rare words has received little attention in prior work. In this paper, we propose a novel approach which incorporates question generation with its dual problem, question answering, into a unified primal-dual framework. Specifically, the question generation component consists of an encoder that jointly encodes the answer with the passage, and a decoder that produces the question. The question answering component then re-asks the generated question on the passage to ensure that the target answer is obtained. We further introduce a knowledge distillation module to improve the model's generalization ability. We conduct an extensive set of experiments on the SQuAD and HotpotQA benchmarks. Experimental results demonstrate the superior performance of the proposed approach over several state-of-the-art methods.
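The primal-dual framework described above can be sketched as a combined training objective: the primal question-generation loss, a dual question-answering consistency term (rewarding questions whose re-asked answer matches the target), and a knowledge-distillation term. This is an illustrative sketch only; the function name, weighting scheme, and coefficients `alpha` and `beta` are assumptions, not the paper's actual formulation.

```python
def primal_dual_loss(qg_loss: float, qa_loss: float, kd_loss: float,
                     alpha: float = 0.5, beta: float = 0.1) -> float:
    """Hypothetical combined objective for a primal-dual AQG setup.

    qg_loss: primal loss of the question-generation decoder
    qa_loss: dual loss from re-asking the generated question and
             comparing the predicted answer with the target answer
    kd_loss: knowledge-distillation loss against a teacher model
    alpha, beta: illustrative trade-off weights (not from the paper)
    """
    return qg_loss + alpha * qa_loss + beta * kd_loss


# Example: per-batch losses plugged into the combined objective.
total = primal_dual_loss(qg_loss=2.0, qa_loss=1.0, kd_loss=0.5)
print(total)
```

In practice each term would be computed by its own network (QG encoder-decoder, QA reader, and teacher model), with gradients from the dual QA term pushing the generator toward answerable questions.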

Citation (APA)

Wang, Q., Yang, L., Quan, X., Feng, F., Liu, D., Xu, Z., … Ma, H. (2022). Learning to Generate Question by Asking Question: A Primal-Dual Approach with Uncommon Word Generation. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, EMNLP 2022 (pp. 46–61). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.emnlp-main.4
