Learning long-term dependencies in segmented memory recurrent neural networks

Abstract

Gradient descent learning algorithms for recurrent neural networks (RNNs) perform poorly on problems with long-term dependencies. In this paper, we propose a novel architecture called the Segmented-Memory Recurrent Neural Network (SMRNN). The SMRNN is trained with an extension of the gradient-based real-time recurrent learning (RTRL) algorithm. We tested the SMRNN on the standard information-latching problem. Our results indicate that gradient descent learning is more effective in the SMRNN than in standard RNNs. © Springer-Verlag 2004.
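The abstract's core idea is a two-level recurrent state: a symbol-level hidden state that updates every time step, and a segment-level state that updates only at segment boundaries, so information at the segment level passes through far fewer nonlinear steps. As a rough illustration only, here is a minimal forward-pass sketch of that idea; the weight names, shapes, and the use of `tanh` are assumptions for this sketch, not the paper's exact formulation.

```python
import numpy as np

def smrnn_forward(xs, Wxh, Whh, Whs, Wss, d):
    """Illustrative forward pass of a segmented-memory RNN.

    h: symbol-level state, updated at every time step.
    s: segment-level state, updated only every d steps (segment boundary),
       so it bridges long time spans in fewer recurrent applications.
    All names and shapes here are hypothetical, not from the paper.
    """
    nh, ns = Whh.shape[0], Wss.shape[0]
    h = np.zeros(nh)  # symbol-level hidden state
    s = np.zeros(ns)  # segment-level hidden state
    for t, x in enumerate(xs, start=1):
        h = np.tanh(Wxh @ x + Whh @ h)        # update every step
        if t % d == 0:                         # segment boundary reached
            s = np.tanh(Whs @ h + Wss @ s)     # segment-level update
    return h, s
```

With segment length `d = 3` and a sequence of 6 inputs, the segment-level state is updated only twice, while the symbol-level state is updated six times.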


CITATION STYLE

APA

Chen, J., & Chaudhari, N. S. (2004). Learning long-term dependencies in segmented memory recurrent neural networks. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 3173, 362–369. https://doi.org/10.1007/978-3-540-28647-9_61
