Memory capacity of input-driven echo state networks at the edge of chaos

15 citations · 32 Mendeley readers

Abstract

Reservoir computing provides a promising approach to efficient training of recurrent neural networks by exploiting the computational properties of the reservoir structure. Various approaches, ranging from suitable initialization to reservoir optimization by training, have been proposed. In this paper we take a closer look at short-term memory capacity, introduced by Jaeger for echo state networks. Memory capacity has recently been investigated with respect to criticality, the so-called edge of chaos, the point at which the network switches from a stable to an unstable dynamic regime. We calculate the memory capacity of the networks for various input data sets, both random and structured, and show how the data distribution affects network performance. We also investigate the effect of reservoir sparsity in this context. © 2014 Springer International Publishing Switzerland.

Citation (APA)

Barančok, P., & Farkaš, I. (2014). Memory capacity of input-driven echo state networks at the edge of chaos. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8681 LNCS, pp. 41–48). Springer Verlag. https://doi.org/10.1007/978-3-319-11179-7_6
