Rectifying the vanishing gradient problem using the ReLU activation function based on a BLSTM neural network

ISSN: 2277-3878

Abstract

Character recognition in handwritten documents is a demanding task, and keyword spotting is currently among the best solutions. Keyword spotting plays a major role in extracting characters or words from unconstrained handwritten text and recognizing them based on the probability of each character, letter, or word. It performs template-free spotting with the help of the CTC token-passing algorithm. The main problem arises while performing backpropagation through the neural network: the vanishing gradient. The recognition rate drops because the error rate grows, and continually increasing the number of hidden layers in response lengthens the network's output time. A large amount of information is lost as the output value is carried from one layer to the next through the activation function. Simulating the problem with an activation function such as the sigmoid yields very low character-recognition accuracy. Replacing the sigmoid with the rectified linear unit (ReLU) keeps the neuron updates from collapsing, because the hidden-layer gradient never saturates toward zero.
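To make the contrast concrete, here is a minimal numerical sketch (my own illustration, not code from the paper) that backpropagates a gradient through a deep stack of layers and compares its norm under sigmoid and ReLU derivatives. The depth, width, and He-style initialization are assumptions chosen for the demonstration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(42)
depth, width = 30, 128          # assumed depth/width for illustration

g_sig = np.ones(width)          # gradient flowing back through a sigmoid net
g_relu = np.ones(width)         # gradient flowing back through a ReLU net

for _ in range(depth):
    W = rng.normal(size=(width, width)) * np.sqrt(2.0 / width)  # He-style init
    z = rng.normal(size=width)  # hypothetical pre-activations at this layer
    s = sigmoid(z)
    g_sig = W.T @ (g_sig * s * (1.0 - s))  # sigmoid' never exceeds 0.25
    g_relu = W.T @ (g_relu * (z > 0))      # ReLU' is exactly 1 on active units

print(f"gradient norm after {depth} sigmoid layers: {np.linalg.norm(g_sig):.3e}")
print(f"gradient norm after {depth} ReLU layers:    {np.linalg.norm(g_relu):.3e}")
```

The sigmoid chain shrinks by a factor bounded by 0.25 per layer and effectively vanishes, while the ReLU chain passes gradients through active units at full strength. On the architecture side, a hedged Keras sketch of the kind of BLSTM pipeline the abstract describes might look as follows; the layer sizes, feature dimension, and character-set size are hypothetical, not the paper's published configuration.

```python
import tensorflow as tf

# Hypothetical BLSTM sketch; all sizes below are illustrative assumptions.
inputs = tf.keras.Input(shape=(None, 64))           # per-frame feature vectors
x = tf.keras.layers.Bidirectional(
    tf.keras.layers.LSTM(128, return_sequences=True))(inputs)
x = tf.keras.layers.Dense(128, activation="relu")(x)  # ReLU instead of sigmoid
outputs = tf.keras.layers.Dense(80, activation="softmax")(x)  # chars + CTC blank
model = tf.keras.Model(inputs, outputs)
# Training would pair this with a CTC loss, e.g. tf.keras.backend.ctc_batch_cost.
```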


Citation (APA)

Venkateswararao, P., & Murugavalli, S. (2019). Rectifying the problem of vanishing gradient problem using relu activation function based on blstm neural network. International Journal of Recent Technology and Engineering, 8(1), 2615–2618.
