Recurrent Neural Network based Models for Word Prediction

  • Ramya M.
  • et al.

Abstract

Globally, people spend a large cumulative amount of time on their mobile devices, laptops, tablets, desktops, etc., for messaging, sending emails, banking, interacting through social media, and other activities. It is desirable to reduce the time spent typing on these devices. This can be achieved when the device offers the user suggestions for what the next word might be after the currently typed word, which also increases typing speed. In this paper, we present a comparative study of four models that address this problem: the Recurrent Neural Network (RNN), the Stacked RNN, the Long Short-Term Memory (LSTM) network, and the Bidirectional LSTM. Our primary goal is to identify the best of these four models for predicting the next word given the current word in English. Our study finds that for next-word prediction, the RNN achieves 60% accuracy with 40% loss, the Stacked RNN 62% accuracy with 38% loss, the LSTM 64% accuracy with 36% loss, and the Bidirectional LSTM 72% accuracy with 28% loss.
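The recurrence these models share can be illustrated with a single step of a vanilla RNN cell in NumPy (an illustrative sketch only; the layer sizes, random weights, and toy word ids below are made-up assumptions, not the paper's actual trained networks):

```python
import numpy as np

# Minimal sketch of one step of a vanilla RNN cell as used in
# next-word prediction: consume a word, update the hidden state,
# and emit a probability distribution over the vocabulary.
vocab_size, embed_dim, hidden_dim = 50, 8, 16
rng = np.random.default_rng(0)

# Parameters (randomly initialised here purely for illustration)
E = rng.normal(size=(vocab_size, embed_dim))      # embedding table
W_xh = rng.normal(size=(embed_dim, hidden_dim))   # input -> hidden
W_hh = rng.normal(size=(hidden_dim, hidden_dim))  # hidden -> hidden (recurrence)
W_hy = rng.normal(size=(hidden_dim, vocab_size))  # hidden -> vocabulary logits

def rnn_step(word_id, h_prev):
    """One recurrence step: h_t = tanh(x W_xh + h_{t-1} W_hh),
    followed by a softmax over next-word logits."""
    x = E[word_id]
    h = np.tanh(x @ W_xh + h_prev @ W_hh)
    logits = h @ W_hy
    probs = np.exp(logits - logits.max())  # stable softmax
    probs /= probs.sum()
    return h, probs

# Run the cell over a toy "sentence" of word ids
h = np.zeros(hidden_dim)
for w in [3, 17, 42]:
    h, probs = rnn_step(w, h)

next_word = int(np.argmax(probs))  # predicted next word id
```

An LSTM replaces the `tanh` update with gated cell-state updates, a stacked RNN feeds each layer's hidden states into another recurrent layer, and a bidirectional LSTM runs a second pass over the sequence in reverse; the input/output interface sketched here stays the same.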

APA

Ramya, M. S., & Selvi, C. S. K. (2019). Recurrent Neural Network based Models for Word Prediction. International Journal of Recent Technology and Engineering (IJRTE), 8(4), 7433–7437. https://doi.org/10.35940/ijrte.d5313.118419
