Optimal learning rules for discrete synapses

36 citations · 91 Mendeley readers

Abstract

There is evidence that biological synapses have a limited number of discrete weight states. Memory storage with such synapses behaves quite differently from synapses with unbounded, continuous weights, as old memories are automatically overwritten by new memories. Consequently, there has been substantial discussion about how this affects learning and storage capacity. In this paper, we calculate the storage capacity of discrete, bounded synapses in terms of Shannon information. We use this to optimize the learning rules and investigate how the maximum information capacity depends on the number of synapses, the number of synaptic states, and the coding sparseness. Below a certain critical number of synapses per neuron (comparable to numbers found in biology), we find that storage is similar to unbounded, continuous synapses. Hence, discrete synapses do not necessarily have lower storage capacity. © 2008 Barrett, van Rossum.
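The following is a minimal toy sketch in Python (not the model or the optimal learning rules derived in the paper) illustrating the overwriting behaviour the abstract describes: binary synapses are stochastically rewritten by a stream of new patterns, and the Shannon information retained about an earlier pattern is estimated empirically. The parameters N, q, T and the helper functions store and mutual_information_bits are illustrative assumptions, not taken from the paper.

    # Toy illustration (assumed setup): binary synapses overwritten by new
    # memories, with retained Shannon information estimated from counts.
    import numpy as np

    rng = np.random.default_rng(0)

    N = 10_000   # number of synapses (illustrative)
    q = 0.1      # per-synapse plasticity probability (illustrative)
    T = 200      # number of memories stored after the tracked one

    def store(weights, pattern, q, rng):
        """Stochastically write a binary pattern into binary weights:
        each synapse adopts the pattern's bit with probability q."""
        update = rng.random(weights.shape) < q
        weights[update] = pattern[update]
        return weights

    def mutual_information_bits(w, p):
        """Empirical mutual information (bits) between a stored bit p
        and the current synaptic state w, from joint frequencies."""
        mi = 0.0
        for a in (0, 1):
            for b in (0, 1):
                pab = np.mean((w == a) & (p == b))
                pa, pb = np.mean(w == a), np.mean(p == b)
                if pab > 0:
                    mi += pab * np.log2(pab / (pa * pb))
        return mi

    weights = rng.integers(0, 2, N)            # random initial weights
    tracked = rng.integers(0, 2, N)            # the memory we track
    weights = store(weights, tracked, q, rng)  # store it

    for t in range(1, T + 1):
        # each new memory partially overwrites the old one
        weights = store(weights, rng.integers(0, 2, N), q, rng)
        if t in (1, 10, 50, 100, 200):
            bits = N * mutual_information_bits(weights, tracked)
            print(f"after {t:4d} new memories: {bits:.1f} bits retained")

With this simple rule the retained information about the tracked pattern decays roughly exponentially in the number of subsequent memories, which is the automatic-overwriting behaviour the abstract contrasts with unbounded, continuous synapses.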

Cite

APA: Barrett, A. B., & van Rossum, M. C. W. (2008). Optimal learning rules for discrete synapses. PLoS Computational Biology, 4(11). https://doi.org/10.1371/journal.pcbi.1000230
