Artificial Neural Networks – ICANN 2009

Abstract

A lot of attention is currently focused on connectionist models known under the name "reservoir computing". The most prominent example of these approaches is a recurrent neural network architecture called the echo state network (ESN). ESNs have been successfully applied to several time series modeling tasks and, according to their authors, performed exceptionally well. Multiple enhancements to the standard ESN have been proposed in the literature. In this paper we follow the opposite direction by suggesting several simplifications to the original ESN architecture. The ESN reservoir features contractive dynamics resulting from its initialization with small weights. Sometimes it serves merely as a simple memory of inputs and provides only negligible "extra value" over much simpler methods. We support this claim experimentally and show that many tasks modeled by ESNs can be handled with much simpler approaches.
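The contractive dynamics mentioned in the abstract can be illustrated with a minimal reservoir sketch. This is not the authors' code; the reservoir size, weight ranges, and spectral-radius scaling below are illustrative assumptions. The key point it demonstrates: with small (spectral radius < 1) recurrent weights, the reservoir state forgets its initial condition and effectively acts as a fading memory of recent inputs.

```python
import numpy as np

# Minimal echo state network reservoir sketch (hypothetical parameters).
rng = np.random.default_rng(0)
n_res, n_in = 100, 1

# Draw random recurrent weights and rescale them to spectral radius 0.9,
# which yields the contractive dynamics discussed in the abstract.
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.1, 0.1, (n_res, n_in))

def run_reservoir(inputs, x0):
    """Drive the reservoir with an input sequence from initial state x0."""
    x = x0.copy()
    for u in inputs:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u))
    return x

# Contractivity check: two very different initial states, driven by the
# same input sequence, converge to (nearly) the same final state.
inputs = rng.uniform(-1, 1, 200)
xa = run_reservoir(inputs, np.ones(n_res))
xb = run_reservoir(inputs, -np.ones(n_res))
print(np.linalg.norm(xa - xb))  # close to zero
```

Because the state after a long drive depends only on the recent input history, a readout trained on such states often cannot exploit much beyond a delay-line memory of inputs, which is the intuition behind the simplifications the paper proposes.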

Citation (APA)

Alippi, C., Polycarpou, M., Panayiotou, C., & Ellinas, G. (Eds.). (2009). Artificial Neural Networks – ICANN 2009 (Vol. 5769, pp. 416–425). Berlin, Heidelberg: Springer Berlin Heidelberg. https://doi.org/10.1007/978-3-642-04277-5
