Textual entailment based question generation

Abstract

This paper proposes a novel question generation (QG) approach based on textual entailment. Many previous QG studies transform a single sentence directly into a question; they require hand-crafted templates or generate simple questions closely resembling the source texts. As a novel approach to QG, this research employs two-step QG: 1) generating new texts entailed by source documents, and 2) transforming the entailed sentences into questions. This process can generate questions that require an understanding of textual entailment to solve. Our system collected 1,367 English Wikipedia sentences as the QG source, retrieved 647 entailed sentences from the web, and transformed them into questions. The evaluation results showed that our system successfully generated nontrivial questions based on textual entailment with 53% accuracy.
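The two-step pipeline described in the abstract can be sketched as below. Note that `retrieve_entailed` and `to_question` are hypothetical stand-ins for the paper's web retrieval and sentence-to-question transformation components, not the authors' actual implementation; the lookup table and rewrite rule are toy placeholders.

```python
# Sketch of the two-step QG pipeline (hypothetical, not the authors' code).

def retrieve_entailed(source: str) -> list[str]:
    # Step 1: collect sentences entailed by the source sentence.
    # Toy stand-in: a fixed lookup table replaces the paper's web retrieval.
    toy_web = {
        "Barack Obama was born in Honolulu.": [
            "Obama was born in Hawaii.",  # entailed (Honolulu is in Hawaii)
        ],
    }
    return toy_web.get(source, [])

def to_question(sentence: str) -> str:
    # Step 2: transform an entailed declarative sentence into a question.
    # Toy rule: a location phrase after "was born in" becomes a "where" question.
    subject, _, rest = sentence.partition(" was born in ")
    if rest:
        return f"Where was {subject} born?"
    return sentence

def generate_questions(source: str) -> list[str]:
    # Chain the two steps: entailment retrieval, then transformation.
    return [to_question(s) for s in retrieve_entailed(source)]

print(generate_questions("Barack Obama was born in Honolulu."))
# → ['Where was Obama born?']
```

Because the question is built from an entailed sentence rather than the source sentence itself, answering it requires the entailment step (here, knowing Honolulu is in Hawaii), which is the property the paper evaluates.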

Citation (APA)

Matsumoto, T., Hasegawa, K., Yamakawa, Y., & Mitamura, T. (2018). Textual entailment based question generation. In 2IS and NLG 2018 - Workshop on Intelligent Interactive Systems and Language Generation, Proceedings of the Workshop (pp. 15–19). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/w18-6704
