Gradient-descent learning algorithms for recurrent neural networks (RNNs) are known to perform poorly on problems involving long-term dependencies. In this paper, we propose a novel architecture called the Segmented-Memory Recurrent Neural Network (SMRNN). The SMRNN is trained with an extended real-time recurrent learning (RTRL) algorithm, which is gradient based. We tested the SMRNN on the standard information-latching problem. Our results indicate that gradient-descent learning is more effective in the SMRNN than in conventional RNNs. © Springer-Verlag 2004.
CITATION
Chen, J., & Chaudhari, N. S. (2004). Learning long-term dependencies in segmented memory recurrent neural networks. Lecture Notes in Computer Science, 3173, 362–369. https://doi.org/10.1007/978-3-540-28647-9_61