Traditional Recurrent Neural Networks (RNNs) perform poorly on learning tasks involving long time-lag dependencies. More recent approaches such as LSTM and its variants significantly improve on RNNs' ability to learn this type of problem. We present an alternative approach to encoding temporal dependencies that associates temporal features with nodes rather than state values, where the nodes explicitly encode dependencies over variable time delays. We show promising results comparing the network's performance to LSTM variants on an extended Reber grammar task. © Springer-Verlag Berlin Heidelberg 2009.
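The extended (embedded) Reber grammar mentioned above is a standard long time-lag benchmark: a wrapper symbol chosen at the second position must be recalled at the end of the string, many steps later. The sketch below is a minimal Python generator for the standard embedded Reber grammar family, included only to illustrate the benchmark; the exact task variant used in the paper may differ.

```python
import random

# Transition table for the standard Reber grammar:
# state -> list of (symbol, next_state); reaching state 6 emits 'E' and halts.
REBER = {
    0: [('B', 1)],
    1: [('T', 2), ('P', 3)],
    2: [('S', 2), ('X', 4)],
    3: [('T', 3), ('V', 5)],
    4: [('X', 3), ('S', 6)],
    5: [('P', 4), ('V', 6)],
}

def reber_string(rng=random):
    """Generate one string from the standard Reber grammar."""
    state, out = 0, []
    while state != 6:
        sym, state = rng.choice(REBER[state])
        out.append(sym)
    out.append('E')
    return ''.join(out)

def embedded_reber_string(rng=random):
    """Embedded variant: the second symbol (T or P) must be reproduced
    just before the final E, creating a long time-lag dependency."""
    wrapper = rng.choice('TP')
    return 'B' + wrapper + reber_string(rng) + wrapper + 'E'
```

A network trained to predict the next symbol must carry the wrapper symbol across the entire inner Reber string, which is what makes the task hard for plain RNNs.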
CITATION
Johnson, K., & MacNish, C. (2009). A novel connectionist network for solving long time-lag prediction tasks. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5866 LNAI, pp. 557–566). https://doi.org/10.1007/978-3-642-10439-8_56