Learning Universal Computations with Spikes

Abstract

Providing the neurobiological basis of information processing in higher animals, spiking neural networks must be able to learn a variety of complicated computations, including the generation of appropriate, possibly delayed, reactions to inputs and the self-sustained generation of complex activity patterns, e.g. for locomotion. Many such computations require the prior construction of intrinsic world models. Here we show how spiking neural networks may solve these different tasks. First, we derive constraints under which classes of spiking neural networks lend themselves as substrates for powerful general-purpose computing. The networks contain dendritic or synaptic nonlinearities and have constrained connectivity. We then combine such networks with learning rules for the output or recurrent connections. We show that this makes it possible to learn even difficult benchmark tasks, such as the self-sustained generation of desired low-dimensional chaotic dynamics or memory-dependent computations. Furthermore, we show how spiking networks can build models of external world systems and use the acquired knowledge to control them.
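The abstract's "learning rules for the output" refers to training a linear readout on top of a recurrent spiking network. A minimal sketch of this idea, assuming a toy leaky integrate-and-fire (LIF) network whose exponentially filtered spike trains feed a readout trained online by recursive least squares (a FORCE-style rule); all parameters and the input-driven setup are illustrative choices for this sketch, not the paper's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy parameters (chosen for this sketch, not taken from the paper)
N = 200            # number of LIF neurons
dt = 1e-3          # integration step (s)
tau_m = 20e-3      # membrane time constant (s)
tau_s = 50e-3      # synaptic filter time constant (s)
T = 3000           # training steps

J = rng.normal(0.0, 1.5 / np.sqrt(N), (N, N))  # random recurrent weights
w_in = rng.normal(0.0, 1.0, N)                 # random input weights
w_out = np.zeros(N)                            # linear readout, learned online
P = np.eye(N)                                  # RLS inverse-correlation estimate

v = rng.uniform(0.0, 1.0, N)   # membrane potentials (threshold 1, reset 0)
r = np.zeros(N)                # leaky average of each neuron's spike train
decay = np.exp(-dt / tau_s)

t_axis = np.arange(T) * dt
target = np.sin(2 * np.pi * 2.0 * t_axis)  # 2 Hz target output signal

sq_err = []
total_spikes = 0
for t in range(T):
    # LIF dynamics: leak toward recurrent + external input, spike at threshold
    I = J @ r + 1.2 * w_in * target[t] + 0.9
    v += (dt / tau_m) * (I - v)
    spikes = v >= 1.0
    v[spikes] = 0.0
    total_spikes += int(spikes.sum())

    # Exponentially filtered spike trains form the basis for the readout
    r *= decay
    r[spikes] += 1.0 - decay

    # Recursive-least-squares (FORCE-style) update of the readout weights
    z = w_out @ r                  # current readout
    e = z - target[t]              # online error
    Pr = P @ r
    k = Pr / (1.0 + r @ Pr)
    P -= np.outer(k, Pr)
    w_out -= e * k
    sq_err.append(e * e)

early = float(np.mean(sq_err[: T // 4]))   # MSE over the first quarter
late = float(np.mean(sq_err[-T // 4:]))    # MSE over the last quarter
print(f"readout MSE: early {early:.3f} -> late {late:.4f}")
```

Here the readout error should shrink as RLS adapts the weights; the paper's harder setting additionally feeds the readout back into the network so that the learned pattern becomes self-sustained rather than input-driven.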

Citation (APA)

Thalmeier, D., Uhlmann, M., Kappen, H. J., & Memmesheimer, R. M. (2016). Learning Universal Computations with Spikes. PLoS Computational Biology, 12(6). https://doi.org/10.1371/journal.pcbi.1004895
