Improving long-term online prediction with decoupled extended Kalman filters


Abstract

Long Short-Term Memory (LSTM) recurrent neural networks (RNNs) outperform traditional RNNs when dealing with sequences involving not only short-term but also long-term dependencies. The decoupled extended Kalman filter learning algorithm (DEKF) works well in online environments and significantly reduces the number of training steps compared to standard gradient-descent algorithms. Previous work on LSTM, however, has always used a form of gradient descent and has not focused on truly online situations. Here we combine LSTM with DEKF and show that this new hybrid improves upon the original learning algorithm when applied to online processing. © Springer-Verlag Berlin Heidelberg 2002.
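To make the DEKF idea concrete, the sketch below applies a decoupled extended Kalman filter to train a toy scalar model online. This is not the paper's LSTM implementation; it is a minimal illustration under assumed settings (a linear model standing in for the network, three weight groups, hand-picked noise parameters `q` and `r`). Weights are partitioned into groups, each group keeps its own small covariance matrix, and the per-group Kalman gains share one global scaling term, which is the "decoupled" approximation that keeps the cost far below a full EKF.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target: a scalar linear map y = w_true . x, learned online sample by sample.
# In the paper the model would be an LSTM; a linear model keeps the sketch short.
n, groups = 6, 3                          # 6 weights split into 3 decoupled groups
w_true = rng.normal(size=n)
w = np.zeros(n)
idx = np.split(np.arange(n), groups)      # weight-group partition (e.g. one per neuron)
P = [np.eye(len(g)) * 100.0 for g in idx] # per-group error covariance (large initial uncertainty)
q, r = 1e-6, 1e-3                         # assumed process / measurement noise levels

errs = []
for t in range(300):
    x = rng.normal(size=n)
    y = w_true @ x                        # desired output at this step
    yhat = w @ x                          # current model output
    e = y - yhat                          # innovation (prediction error)
    # Per-group Jacobians of the output w.r.t. that group's weights;
    # for the linear model these are just the corresponding inputs.
    H = [x[g] for g in idx]
    # Global scalar A = (r + sum_i H_i^T P_i H_i)^(-1) for a 1-D output:
    # this term couples the groups, while the covariances stay block-diagonal.
    A = 1.0 / (r + sum(h @ Pi @ h for h, Pi in zip(H, P)))
    for g, h, Pi in zip(idx, H, P):
        K = Pi @ h * A                    # per-group Kalman gain
        w[g] += K * e                     # weight update from the shared innovation
        Pi -= np.outer(K, h @ Pi)         # covariance measurement update
        Pi += q * np.eye(len(g))          # process noise keeps the filter adaptive
    errs.append(e * e)

print(np.mean(errs[:20]), np.mean(errs[-20:]))  # squared error early vs. late
```

Because each group's covariance stays small (here at most 2x2), one update step costs far less than a full EKF over all weights, at the price of ignoring cross-group correlations; the error on this toy problem still drops rapidly within a few hundred online steps.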

Citation (APA)

Pérez-Ortiz, J. A., Schmidhuber, J., Gers, F. A., & Eck, D. (2002). Improving long-term online prediction with decoupled extended Kalman filters. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 2415 LNCS, pp. 1055–1060). Springer Verlag. https://doi.org/10.1007/3-540-46084-5_171
