Recurrent neural network with dynamic memory


Abstract

Recurrent neural networks (RNNs) show strong performance and learning capability in sensory processing, sequence learning, and reinforcement learning. However, the memory ability of RNNs is limited by the vanishing gradient problem, memory conflict, and very limited memory capacity. To overcome these defects, we propose a novel RNN model called RNN-DM. The proposed model has two types of memory: internal and external. The internal memory resides in the neuron units of the hidden layer; it resolves the memory conflict problem and reduces the influence of vanishing gradients during network learning. The external memory is a memory matrix that stores complex data structures and variables, giving the model the capacity to overcome the limited-memory problem.
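The abstract does not give the model's equations, but the two-memory idea can be sketched. Below is a minimal, hypothetical NumPy illustration: the internal memory is a gated per-neuron cell state (which eases vanishing gradients, LSTM-style), and the external memory is a matrix read and written via content-based addressing. All names, gate choices, and update rules here are illustrative assumptions, not the paper's RNN-DM equations.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class TwoMemoryCell:
    """Hypothetical sketch of a recurrent cell with internal and external memory."""

    def __init__(self, in_dim, hid_dim, mem_slots, mem_width):
        self.W = rng.normal(0, 0.1, (hid_dim, in_dim + hid_dim + mem_width))
        self.b = np.zeros(hid_dim)
        self.Wk = rng.normal(0, 0.1, (mem_width, hid_dim))   # key projection (assumed)
        self.c = np.zeros(hid_dim)                 # internal memory: gated cell state
        self.h = np.zeros(hid_dim)                 # hidden state
        self.M = np.zeros((mem_slots, mem_width))  # external memory matrix

    def step(self, x):
        # 1) Read from the external memory matrix by dot-product attention.
        key = self.Wk @ self.h
        w = softmax(self.M @ key)      # addressing weights over memory slots
        r = w @ self.M                 # read vector
        # 2) Update the internal (per-neuron) memory with leaky integration,
        #    a simple stand-in for the gating that mitigates vanishing gradients.
        z = np.tanh(self.W @ np.concatenate([x, self.h, r]) + self.b)
        self.c = 0.9 * self.c + 0.1 * z
        self.h = np.tanh(self.c)
        # 3) Write back into external memory (simple additive write, assumed).
        self.M += 0.1 * np.outer(w, key)
        return self.h

cell = TwoMemoryCell(in_dim=4, hid_dim=8, mem_slots=16, mem_width=8)
seq = rng.normal(size=(5, 4))
outs = [cell.step(x) for x in seq]
print(len(outs), outs[-1].shape)  # 5 (8,)
```

The separation mirrors the abstract's claim: the cell state handles short-range gradient flow, while the matrix provides extra capacity for stored structure.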

Citation (APA)

Bai, J., Dong, T., Liao, X., & Mu, N. (2018). Recurrent neural network with dynamic memory. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10878 LNCS, pp. 339–345). Springer Verlag. https://doi.org/10.1007/978-3-319-92537-0_39
