Adaptive critical reservoirs with power law forgetting of unexpected input sequences

Abstract

The echo-state condition defines an upper limit for the hidden-layer connectivity strength in recurrent neural networks. If the network is below this limit, there is an injective, continuous mapping from the recent input history to the internal state of the network. Above it, the network becomes chaotic, and the dependence on the initial state may never be washed out. I focus on the biological relevance of echo state networks with a critical connectivity strength at the separation line between these two regimes and discuss related biological findings; in particular, there is evidence that the neural connectivity in cortical slices is tuned to a critical level. In addition, I propose a model that makes use of a special learning mechanism within the recurrent layer and the input connectivity. Results show that, after adaptation, traces of single unexpected events indeed persist in the network for longer than exponential forgetting would allow, i.e. they are forgotten according to a power law.
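Since the abstract does not include code, the following is a minimal sketch of the standard echo-state setup it refers to; the reservoir size, Gaussian weights, and tanh update rule are my own assumptions, not taken from the paper. A random recurrent weight matrix is rescaled so that its spectral radius sits exactly at the critical value of 1, and the washing-out of a one-off state perturbation (an "unexpected event") is tracked while both copies of the network receive the same input stream.

```python
# Minimal sketch of a critical echo-state reservoir (illustration only;
# the paper's adaptive learning mechanism is NOT reproduced here).
import numpy as np

rng = np.random.default_rng(0)

N = 200                                    # reservoir size (assumed)
W = rng.standard_normal((N, N)) / np.sqrt(N)
W /= np.max(np.abs(np.linalg.eigvals(W)))  # rescale: spectral radius -> 1 (critical)
w_in = rng.standard_normal(N)              # input weights (assumed Gaussian)

def step(x, u):
    """One reservoir update with a tanh nonlinearity."""
    return np.tanh(W @ x + w_in * u)

# Two copies of the network: identical input, states differing by a small kick.
x_a = np.zeros(N)
x_b = np.zeros(N)
x_b[0] = 1e-3                              # the one-off "unexpected event"

for t in range(1, 501):
    u = rng.standard_normal()              # shared random input stream
    x_a, x_b = step(x_a, u), step(x_b, u)
    if t % 100 == 0:
        print(t, np.linalg.norm(x_a - x_b))
```

Below the echo-state limit the printed distance shrinks roughly exponentially with time; the paper's claim is that, once the network has adapted to the input statistics via its learning rule, such traces of unexpected events decay only as a power law.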

Citation (APA)

Mayer, N. M. (2014). Adaptive critical reservoirs with power law forgetting of unexpected input sequences. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8681 LNCS, pp. 49–56). Springer Verlag. https://doi.org/10.1007/978-3-319-11179-7_7
