Traditional Question Generation (TQG) aims to generate a question given an input passage and an answer. When there is a sequence of answers, Sequential Question Generation (SQG) can be performed to produce a series of interconnected questions. SQG is challenging because information omission and coreference frequently occur between questions. Prior works treated SQG as a dialogue generation task and generated each question recurrently; however, they suffered from error cascades and could capture only limited context dependencies. To address this, we generate questions in a semi-autoregressive way: our model divides questions into groups and generates the questions within each group in parallel. During this process, it builds two graphs focusing on information from passages and answers respectively, and performs dual-graph interaction to gather information for generation. In addition, we design an answer-aware attention mechanism and a coarse-to-fine generation scenario. Experiments on our new dataset containing 81.9K questions show that our model substantially outperforms prior works.
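The following is a minimal Python sketch of the semi-autoregressive decoding order the abstract describes, not the authors' implementation: questions are partitioned into groups, groups are generated one after another (autoregressive across groups), and the questions inside a group are produced in parallel, conditioned only on earlier groups. The fixed `group_size` rule and the `generate_group` callback are hypothetical placeholders standing in for the paper's grouping strategy and model.

```python
from typing import Callable, List


def semi_autoregressive_generate(
    answers: List[str],
    group_size: int,
    generate_group: Callable[[List[str], List[str]], List[str]],
) -> List[str]:
    """Generate one question per answer, group by group.

    answers:        answer spans, in the order their questions should appear.
    group_size:     number of questions emitted in parallel per step
                    (a hypothetical grouping rule for illustration).
    generate_group: model call mapping (answers in this group, questions
                    generated so far) to the group's questions in one
                    parallel step.
    """
    questions: List[str] = []
    for start in range(0, len(answers), group_size):
        group_answers = answers[start:start + group_size]
        # Within a group: questions are produced in parallel, conditioned
        # only on previously completed groups, not on each other.
        questions.extend(generate_group(group_answers, questions))
    return questions


if __name__ == "__main__":
    # Toy stand-in for the model: template questions, no real decoding.
    def toy_generate_group(group_answers, history):
        return [f"What about {a}? (given {len(history)} prior questions)"
                for a in group_answers]

    print(semi_autoregressive_generate(
        ["the Eiffel Tower", "1889", "Gustave Eiffel", "Paris"],
        group_size=2,
        generate_group=toy_generate_group,
    ))
```

With `group_size=2`, the four questions above are produced in two sequential steps of two parallel generations each, which is the middle ground between fully autoregressive (one question per step) and fully parallel decoding.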
CITATION STYLE
Chai, Z., & Wan, X. (2020). Learning to ask more: Semi-autoregressive sequential question generation under dual-graph interaction. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 225–237). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.acl-main.21