Trainable hardware for dynamical computing using error backpropagation through physical media

Abstract

Neural networks are currently implemented on digital von Neumann machines, which do not fully leverage their intrinsic parallelism. We demonstrate how a novel class of reconfigurable dynamical systems can be used for analogue information processing, mitigating this problem. Our generic hardware platform for dynamic, analogue computing consists of a reciprocal linear dynamical system with nonlinear feedback. Thanks to reciprocity, a ubiquitous property of many physical phenomena such as the propagation of light and sound, error backpropagation, a crucial step in tuning such systems towards a specific task, can happen in hardware. This can potentially speed up the optimization process significantly, offering important benefits for the scalability of neuro-inspired hardware. Using one experimentally validated example and one conceptual example, we show that such systems may provide a straightforward mechanism for constructing highly scalable, fully dynamical analogue computers.
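
As a rough illustration of the idea sketched in the abstract, the following minimal numerical example trains a linear dynamical system with nonlinear feedback by backpropagation through time. This is a hypothetical discrete-time surrogate in Python/NumPy, not the authors' experimental photonic or electronic setup; the state dimension, tanh nonlinearity, squared-error loss, and toy target signal are all assumptions made for illustration. The symmetric propagation matrix A stands in for the reciprocal physical medium: because A equals its transpose, the backward (adjoint) pass reuses the same propagation operator, which is the property that allows the error backpropagation to happen in hardware.

# Minimal sketch: backpropagation through a reciprocal linear dynamical
# system with nonlinear feedback. Illustrative surrogate only; all specific
# choices (tanh, dimensions, loss, target) are assumptions, not the paper's setup.
import numpy as np

rng = np.random.default_rng(0)
N, T = 20, 50                                 # state dimension, time steps

# Reciprocal linear medium: symmetric propagation matrix A (A == A.T),
# scaled to spectral radius 0.9 for stability.
A = rng.standard_normal((N, N))
A = 0.5 * (A + A.T)
A *= 0.9 / np.max(np.abs(np.linalg.eigvals(A)))

W_fb = 0.1 * rng.standard_normal((N, N))      # trainable nonlinear feedback weights
W_out = 0.1 * rng.standard_normal((1, N))     # trainable linear readout

u = rng.standard_normal((T, N))               # input drive
y_target = np.sin(np.linspace(0, 4 * np.pi, T))  # toy target signal

def forward(W_fb, W_out):
    s = np.zeros(N)
    states, preds = [], []
    for t in range(T):
        s = A @ s + W_fb @ np.tanh(s) + u[t]  # linear medium + nonlinear feedback
        states.append(s)
        preds.append((W_out @ np.tanh(s)).item())
    return np.array(states), np.array(preds)

def backward(states, preds):
    # BPTT: the adjoint state is propagated by A.T. Because A is symmetric
    # (reciprocity), A.T == A, so the same physical propagation operation
    # could carry the error signal backwards through the medium.
    dW_fb = np.zeros_like(W_fb)
    dW_out = np.zeros_like(W_out)
    lam = np.zeros(N)                         # backpropagated error from later times
    for t in reversed(range(T)):
        s = states[t]
        e = preds[t] - y_target[t]            # dL/dy for squared-error loss
        dW_out += e * np.tanh(s)[None, :]
        ds = e * (W_out.flatten() * (1 - np.tanh(s) ** 2)) + lam
        prev = states[t - 1] if t > 0 else np.zeros(N)
        dW_fb += np.outer(ds, np.tanh(prev))
        # propagate error to the previous state through A.T and the feedback path
        lam = A.T @ ds + (1 - np.tanh(prev) ** 2) * (W_fb.T @ ds)
    return dW_fb, dW_out

lr = 1e-3
for step in range(200):
    states, preds = forward(W_fb, W_out)
    dW_fb, dW_out = backward(states, preds)
    W_fb -= lr * dW_fb
    W_out -= lr * dW_out

_, preds = forward(W_fb, W_out)
print("final MSE:", float(np.mean((preds - y_target) ** 2)))

In the paper's setting the gradient computation would not be simulated as above but obtained by physically injecting the error signal back into the reciprocal medium; the simulation only shows why the symmetry A.T == A makes that possible.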

Citation (APA)

Hermans, M., Burm, M., Van Vaerenbergh, T., Dambre, J., & Bienstman, P. (2015). Trainable hardware for dynamical computing using error backpropagation through physical media. Nature Communications, 6. https://doi.org/10.1038/ncomms7729
