Abstract
Deep Neural Networks (DNNs) have achieved strong results on difficult learning tasks because of their powerful modeling capacity. Whenever large data sets are available for training, DNNs work very well, as they can map input sequences (questions) to output sequences (answers). In this paper, we present an end-to-end approach based on a standard sequence-to-sequence learning model that makes minimal assumptions about sequence structure and uses our processed data set. Our method employs a multi-layered Long Short-Term Memory (LSTM) network with an attention mechanism. Our experiments are based on a chatbot implementation trained on a Twitter data set and the Cornell Movie-Dialogs Corpus. Because the amount of meaningful data available is limited, the quality of responses is constrained; however, the LSTM had no difficulty handling long sentences.
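The paper itself includes no code; as a minimal illustrative sketch of the LSTM recurrence the model relies on, the following shows one LSTM time step with scalar input and hidden state (hidden size 1). The weight values and the `lstm_step` helper are hypothetical, chosen only to make the gating arithmetic concrete; a real multi-layer seq2seq model would use vector states and learned weights.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, W):
    # One LSTM time step for scalar input/state (hidden size 1).
    # W maps each gate name to (input weight, recurrent weight, bias)
    # for the input (i), forget (f), output (o) gates and the
    # candidate cell state (g). Values here are illustrative only.
    pre = {}
    for name in ("i", "f", "o", "g"):
        w_x, w_h, b = W[name]
        pre[name] = w_x * x + w_h * h_prev + b
    i = sigmoid(pre["i"])       # input gate
    f = sigmoid(pre["f"])       # forget gate
    o = sigmoid(pre["o"])       # output gate
    g = math.tanh(pre["g"])     # candidate cell state
    c = f * c_prev + i * g      # new cell state
    h = o * math.tanh(c)        # new hidden state
    return h, c

# Toy weights (hypothetical, for illustration only).
W = {"i": (0.5, 0.1, 0.0), "f": (0.5, 0.1, 1.0),
    "o": (0.5, 0.1, 0.0), "g": (0.5, 0.1, 0.0)}

h, c = 0.0, 0.0
for x in [1.0, 0.5, -0.3]:  # a short input sequence
    h, c = lstm_step(x, h, c, W)
```

The multiplicative forget gate (`f * c_prev`) is what lets the cell state carry information across many time steps, which is why LSTMs cope well with the long sentences mentioned above.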
Citation
Sharma, D., Samad, A., & Dev, D. (2019). Analysis of chatbot data generation using LSTM. International Journal of Innovative Technology and Exploring Engineering, 8(6), 601–603.