Analysis of reservoir computing focusing on the spectrum of bistable delayed dynamical systems

Abstract

Reservoir computing (RC) is a machine-learning paradigm capable of processing empirical time-series data. It is based on a neural network whose fixed hidden layer, called a reservoir, has a high-dimensional state space. Reservoirs that include time delays are considered good candidates for practical applications because they simplify hardware realization of high-dimensional reservoirs. The performance of a well-trained RC depends both on the dynamical properties of the reservoir's attractors and on the task being solved. Conventional monostable RCs therefore face task-wise optimization problems for the reservoir, which have so far been addressed by trial and error. In this study, we analyzed the relationship between the dynamical properties of a time-delay reservoir and its performance in terms of the spectra of the delayed dynamical systems, which may facilitate the development of unified, systematic optimization techniques for time-delay reservoirs. In addition, we propose a novel RC framework based on bistable reservoir dynamics that performs well on distinct tasks without task-wise optimization, which can reduce complicated hardware management of the reservoirs.
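
To illustrate the delay-based reservoir architecture described in the abstract, the following Python sketch shows a minimal single-node time-delay reservoir with a trained linear readout. It is an illustrative assumption only: the node nonlinearity, the parameter values (eta, gamma, N_virtual), and the toy prediction task are hypothetical and do not reproduce the bistable delayed system analyzed in the paper. Note that only the readout weights are trained; the reservoir itself stays fixed, mirroring the "fixed hidden layer" described above.

    import numpy as np

    # Minimal sketch of a single-node time-delay reservoir (illustrative only,
    # not the authors' bistable system). One nonlinear node with delayed
    # feedback is time-multiplexed over N_virtual "virtual nodes"; only the
    # linear readout is trained, via ridge regression.

    rng = np.random.default_rng(0)

    N_virtual = 50          # virtual nodes per delay interval (hypothetical)
    eta, gamma = 0.5, 0.05  # feedback and input scaling (hypothetical values)
    mask = rng.uniform(-1.0, 1.0, N_virtual)  # fixed random input mask

    def run_reservoir(u):
        """Drive the delayed node with input sequence u; return state matrix."""
        states = np.zeros((len(u), N_virtual))
        delayed = np.zeros(N_virtual)           # node values one delay earlier
        for t, u_t in enumerate(u):
            for i in range(N_virtual):
                # saturating nonlinearity driven by delayed feedback and masked input
                delayed[i] = np.tanh(eta * delayed[i] + gamma * mask[i] * u_t)
            states[t] = delayed
        return states

    def train_readout(states, target, ridge=1e-6):
        """Ridge-regression readout: solve (X^T X + lambda I) W = X^T y."""
        X = np.hstack([states, np.ones((len(states), 1))])  # add bias column
        return np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ target)

    # Toy usage: one-step-ahead prediction of a noisy sine wave.
    u = np.sin(0.2 * np.arange(1000)) + 0.05 * rng.standard_normal(1000)
    X = run_reservoir(u[:-1])
    W = train_readout(X[200:], u[201:])         # discard transient, fit readout
    pred = np.hstack([X, np.ones((len(X), 1))]) @ W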

Citation (APA)

Kinoshita, I., Akao, A., Shirasaka, S., Kotani, K., & Jimbo, Y. (2019). Analysis of reservoir computing focusing on the spectrum of bistable delayed dynamical systems. Electronics and Communications in Japan, 102(2), 15–20. https://doi.org/10.1002/ecj.12142
