Text Generation Using Long Short-Term Memory Networks

Abstract

The domain of natural language processing has recently achieved remarkable breakthroughs, especially since the advent of deep neural networks. This has enabled machine learning engineers to develop deep models capable of high-level automation, empowering computer systems to interact with humans competently. Using a special class of deep neural networks known as recurrent neural networks, a variety of natural language processing applications can be realized, including sentiment analysis, part-of-speech tagging, machine translation, and text generation. This paper presents a deep, stacked long short-term memory (LSTM) network, an advanced form of recurrent neural network, that generates text from a random input seed. The paper discusses the shortcomings of conventional recurrent neural networks, thereby motivating long short-term memory networks, and describes their architecture and the methodologies adopted.
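To make the described approach concrete, below is a minimal sketch of a stacked LSTM character-level text generator in Keras. It is not the authors' code: the placeholder corpus, window length of 40 characters, two 128-unit LSTM layers, and training settings are all illustrative assumptions. It shows the two elements the abstract names: a stacked LSTM (the first layer returns full sequences so the second can consume them) and generation from a random seed drawn from the corpus.

```python
# A minimal sketch (not the paper's exact code) of a stacked LSTM
# character-level text generator. Corpus, layer sizes, and
# hyperparameters are illustrative assumptions.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

text = "hello world, this is a tiny illustrative corpus. " * 20  # placeholder corpus
chars = sorted(set(text))
char_to_idx = {c: i for i, c in enumerate(chars)}
idx_to_char = {i: c for c, i in char_to_idx.items()}

seq_len = 40
# Build training pairs: each 40-character window predicts the next character.
X, y = [], []
for i in range(len(text) - seq_len):
    X.append([char_to_idx[c] for c in text[i:i + seq_len]])
    y.append(char_to_idx[text[i + seq_len]])
X = np.eye(len(chars))[np.array(X)]  # one-hot inputs: (samples, seq_len, vocab)
y = np.eye(len(chars))[np.array(y)]  # one-hot targets: (samples, vocab)

# Two stacked LSTM layers: the first returns its full output sequence
# so the second layer can process it, forming a "deep, stacked" LSTM.
model = Sequential([
    LSTM(128, return_sequences=True, input_shape=(seq_len, len(chars))),
    LSTM(128),
    Dense(len(chars), activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam")
model.fit(X, y, batch_size=64, epochs=5)

# Generate text from a random seed taken from the corpus.
start = np.random.randint(0, len(text) - seq_len)
seed = text[start:start + seq_len]
generated = seed
for _ in range(200):
    x = np.eye(len(chars))[[char_to_idx[c] for c in generated[-seq_len:]]]
    probs = model.predict(x[None, ...], verbose=0)[0].astype("float64")
    probs /= probs.sum()  # renormalize to guard against float rounding
    next_idx = np.random.choice(len(chars), p=probs)
    generated += idx_to_char[next_idx]
print(generated)
```

Sampling from the softmax distribution rather than always taking the argmax keeps the generated text from collapsing into repetitive loops, a common design choice in this style of model.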

Citation

Dhall, I., Vashisth, S., & Saraswat, S. (2020). Text Generation Using Long Short-Term Memory Networks. In Lecture Notes in Networks and Systems (Vol. 106, pp. 649–657). Springer. https://doi.org/10.1007/978-981-15-2329-8_66
