An analytical approach to single node delay-coupled reservoir computing

Abstract

Reservoir computing has been successfully applied to difficult time series prediction tasks by injecting an input signal into a spatially extended reservoir of nonlinear subunits, which performs history-dependent nonlinear computation. Recently, the network was replaced by a single nonlinear node, delay-coupled to itself. Instead of a spatial topology, subunits are arrayed in time along one delay span of the system. As a result, the reservoir exists only implicitly in a single delay differential equation, whose numerical solution is costly. We derive here approximate analytical equations for the reservoir by solving the underlying system explicitly. The analytical approximation represents the system accurately and yields comparable performance on reservoir benchmark tasks, while reducing computational costs by several orders of magnitude. This has important implications for electronic realizations of the reservoir and opens up new possibilities for optimization and theoretical investigation. © 2013 Springer-Verlag Berlin Heidelberg.
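To make the architecture in the abstract concrete, the sketch below simulates a generic single-node delay-coupled reservoir with time-multiplexed virtual nodes, integrating a delay differential equation of the form dx/dt = -x(t) + f(eta * x(t - tau) + gamma * J(t)) by simple Euler steps. The tanh nonlinearity, all parameter values, and the random binary input mask are illustrative assumptions for a minimal sketch, not the specific system or the analytical approximation derived in the paper; the paper's contribution is precisely to replace this kind of costly numerical integration with explicit analytical expressions for the virtual-node states.

import numpy as np

def run_delay_reservoir(u, n_virtual=50, theta=0.2, eta=0.5, gamma=0.05,
                        steps_per_node=10, seed=0):
    """Collect virtual-node states of a single delay-coupled node driven by u.

    u              : 1-D input sequence (one sample per delay span tau)
    n_virtual      : number of virtual nodes along one delay span
    theta          : temporal spacing of virtual nodes (tau = n_virtual * theta)
    steps_per_node : Euler sub-steps per virtual-node interval
    Returns an array of shape (len(u), n_virtual).
    """
    rng = np.random.default_rng(seed)
    mask = rng.choice([-1.0, 1.0], size=n_virtual)    # random binary input mask
    dt = theta / steps_per_node
    delay_len = n_virtual * steps_per_node            # one delay span tau in Euler steps
    buffer = np.zeros(delay_len)                      # node trajectory over the last tau
    states = np.zeros((len(u), n_virtual))
    f = np.tanh                                       # illustrative nonlinearity
    x_last = 0.0                                      # node state at the end of the previous span

    for k, u_k in enumerate(u):
        # Mask the input sample and hold each masked value for theta (time multiplexing).
        J = np.repeat(mask * u_k, steps_per_node)
        for i in range(delay_len):
            x_delayed = buffer[i]                     # state exactly one delay tau earlier
            x_prev = buffer[i - 1] if i > 0 else x_last
            buffer[i] = x_prev + dt * (-x_prev + f(eta * x_delayed + gamma * J[i]))
        x_last = buffer[-1]
        # Sample the trajectory every theta to obtain the virtual-node states.
        states[k] = buffer[steps_per_node - 1::steps_per_node]
    return states

# Example usage (hypothetical input): drive the node with a random sequence and
# collect reservoir states, e.g. for a subsequent linear (ridge-regression) readout.
# states = run_delay_reservoir(np.random.default_rng(1).uniform(-1, 1, 200))

In the standard reservoir computing setup, only such a linear readout from the collected states would be trained; the reservoir itself stays fixed, which is why an accurate analytical approximation of these states can substitute for the numerical solver wholesale.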


CITATION STYLE

APA

Schumacher, J., Toutounji, H., & Pipa, G. (2013). An analytical approach to single node delay-coupled reservoir computing. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8131 LNCS, pp. 26–33). https://doi.org/10.1007/978-3-642-40728-4_4
