Automatic Question Generation using Sequence to Sequence RNN Model


Abstract

Automatic Question Generation (AQG) has recently received growing attention in natural language processing (NLP). The task is to generate questions from a text passage such that certain sub-spans of that passage answer the questions produced. Traditional methods predominantly rely on rigid heuristic rules to turn a sentence into related questions. In this research, we propose using a neural encoder-decoder model to produce substantive and complex questions from natural-language sentences. We apply an attention-based sequence-to-sequence learning paradigm to the task and analyze the impact of encoding information at the sentence level versus the paragraph level. Information retrieval and NLP are the core components of AQG. The approach incorporates production rules, recurrent neural network (RNN) based encoder-decoder sequence-to-sequence (seq2seq) models, and other intelligent techniques. An RNN is used for its long short-term memory (LSTM) capability. The proposed system focuses on generating factual WH-type questions.
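The abstract names attention-based seq2seq decoding but gives no implementation details, so the following is an illustration only: a minimal pure-Python sketch of the dot-product attention step that such a decoder typically performs at each generation step. The function names and toy vectors are my own assumptions, not from the paper.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def dot_product_attention(decoder_state, encoder_states):
    """Return (attention weights, context vector) for one decoding step.

    decoder_state: current decoder hidden state (list of floats)
    encoder_states: one hidden state per source token (list of lists)
    """
    # Alignment score for each encoder state: dot product with the decoder state.
    scores = [sum(d * e for d, e in zip(decoder_state, h)) for h in encoder_states]
    weights = softmax(scores)
    # Context vector: attention-weighted sum of encoder states.
    dim = len(decoder_state)
    context = [sum(w * h[i] for w, h in zip(weights, encoder_states))
               for i in range(dim)]
    return weights, context

# Toy example: three encoder states with hidden size 2.
enc = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
dec = [1.0, 0.0]
weights, context = dot_product_attention(dec, enc)
```

In a full seq2seq question generator, the context vector would be concatenated with the decoder state to predict the next question word; the attention weights indicate which source tokens (e.g. the answer span) the model is focusing on.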

CITATION STYLE

APA

Jayarajan, A. K., A., A. P., & Sunny, A. (2020). Automatic Question Generation using Sequence to Sequence RNN Model. International Journal of Innovative Technology and Exploring Engineering, 9(5), 1799–1803. https://doi.org/10.35940/ijitee.e2675.039520
