Text Generation using Neural Models

Abstract

Automatically generated summaries of long and short texts are commonly used in digital services. In this paper, we study various neural models for text generation, including a successful approach based on generative adversarial networks (GANs). Our main focus is on generating text using the Recurrent Neural Network (RNN) and its variants and analyzing the results. We generated and translated text while varying the number of training epochs and the sampling temperature to improve the confidence of the model, as well as varying the size of the input file. The Long Short-Term Memory (LSTM) model responded notably to these varying parameters, and its performance was better when a dataset of appropriate size was provided for training. The resulting model was tested on datasets of varying sizes. The evaluations show that the output generated by the model does not correlate with the corresponding datasets, which means the generated output differs from the training data rather than reproducing it.
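As a rough illustration of the setup the abstract describes, the sketch below trains a small character-level LSTM and samples from it with a temperature parameter (lower temperature makes the model more confident). This is a minimal sketch assuming TensorFlow/Keras; the toy corpus, window length, layer sizes, epoch count, and the sample() helper are hypothetical placeholders, not the paper's actual configuration or data.

```python
# Minimal sketch: character-level LSTM text generation with temperature
# sampling (assumed TensorFlow/Keras setup; corpus and hyperparameters
# are illustrative placeholders, not the paper's actual experiment).
import numpy as np
import tensorflow as tf

corpus = "the quick brown fox jumps over the lazy dog. " * 50  # placeholder text
chars = sorted(set(corpus))
char2idx = {c: i for i, c in enumerate(chars)}
idx2char = np.array(chars)

seq_len = 20
# Build (input window, next character) training pairs from the corpus.
X = np.array([[char2idx[c] for c in corpus[i:i + seq_len]]
              for i in range(len(corpus) - seq_len)])
y = np.array([char2idx[corpus[i + seq_len]]
              for i in range(len(corpus) - seq_len)])

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(len(chars), 32),
    tf.keras.layers.LSTM(128),
    tf.keras.layers.Dense(len(chars)),  # logits over the character vocabulary
])
model.compile(optimizer="adam",
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))
model.fit(X, y, epochs=5, batch_size=64, verbose=0)  # vary epochs as in the paper

def sample(seed, length=100, temperature=1.0):
    """Generate `length` characters; lower temperature -> more confident output."""
    text = list(seed)
    for _ in range(length):
        window = [char2idx[c] for c in text[-seq_len:]]
        logits = model.predict(np.array([window]), verbose=0)[0]
        # Scale logits by temperature, then renormalize in float64 so the
        # probabilities sum to 1 for np.random.choice.
        probs = tf.nn.softmax(logits / temperature).numpy().astype("float64")
        probs /= probs.sum()
        text.append(idx2char[np.random.choice(len(chars), p=probs)])
    return "".join(text)

print(sample("the quick brown fox ", temperature=0.5))
```

Dividing the logits by the temperature before the softmax sharpens (low temperature) or flattens (high temperature) the output distribution; this is the knob the abstract refers to when varying temperature alongside the number of epochs.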

Citation (APA)

Text Generation using Neural Models. (2019). International Journal of Innovative Technology and Exploring Engineering, 9(2S), 19–23. https://doi.org/10.35940/ijitee.b1006.1292s19
