Re-encoding of associations by recurrent plasticity increases memory capacity

Abstract

Recurrent networks have been proposed as a model of associative memory. In such models, memory items are stored in the strength of connections between neurons. These modifiable connections, or synapses, constitute a shared resource among all stored memories, limiting the capacity of the network. Synaptic plasticity at different time scales can play an important role in optimizing the representation of associative memories by keeping them sparse, uncorrelated, and non-redundant. Here, we use a model of sequence memory to illustrate how plasticity allows a recurrent network to self-optimize by gradually re-encoding the representation of its memory items. A learning rule is used to sparsify large patterns, i.e., patterns with many active units. As a result, pattern sizes become more homogeneous, which increases the network's dynamical stability during sequence recall and allows more patterns to be stored. Finally, we show that the learning rule allows for online learning in that it keeps the network in a robust dynamical steady state while storing new memories and overwriting old ones. © 2014 Medina and Leibold.
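
The re-encoding idea can be made concrete with a toy simulation. The following is a minimal sketch, not the authors' actual model: it stores a sequence of binary patterns in a clipped-Hebbian (Willshaw-style) recurrent weight matrix and compares sequence recall before and after an illustrative sparsification step that trims oversized patterns to a common size. All names, parameter values, and the relative firing threshold are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200        # number of neurons
P = 40         # number of patterns in the stored sequence
K_TARGET = 10  # homogeneous pattern size after re-encoding (5% of N)

# Patterns of heterogeneous size: some far larger than the target.
patterns = np.zeros((P, N), dtype=np.uint8)
for mu, k in enumerate(rng.integers(5, 50, size=P)):
    patterns[mu, rng.choice(N, size=int(k), replace=False)] = 1


def store_sequence(pats):
    """Clipped Hebbian (Willshaw-style) storage of successive transitions."""
    W = np.zeros((N, N), dtype=np.uint8)
    for pre, post in zip(pats[:-1], pats[1:]):
        W |= np.outer(post, pre)  # binary synapses: set once, never exceed 1
    return W


def recall_step(W, x, theta=0.75):
    """One retrieval step: a unit fires if most of its stored inputs are active."""
    if x.sum() == 0:
        return x
    h = W @ x.astype(np.int64)  # integer dendritic sums, no uint8 overflow
    return (h >= theta * x.sum()).astype(np.uint8)


def sparsify(pats, k_target):
    """Illustrative re-encoding: trim oversized patterns to a common size."""
    out = pats.copy()
    for p in out:  # rows are views, so edits modify `out` in place
        active = np.flatnonzero(p)
        if active.size > k_target:
            p[rng.choice(active, size=active.size - k_target, replace=False)] = 0
    return out


# Compare sequence recall with raw versus re-encoded patterns.
for label, pats in (("raw", patterns), ("re-encoded", sparsify(patterns, K_TARGET))):
    W = store_sequence(pats)
    x, correct = pats[0], 0
    for mu in range(1, P):
        x = recall_step(W, x)
        correct += np.array_equal(x, pats[mu])
    print(f"{label:10s}: {correct}/{P - 1} transitions recalled correctly")
```

In this sketch the threshold scales with the current activity level, so recall is sensitive to heterogeneous pattern sizes: oversized patterns raise crosstalk in the weight matrix and can trigger runaway or collapsing activity, whereas trimming them toward a common size tends to stabilize the retrieval dynamics. That is the qualitative effect the abstract describes, illustrated under these assumed parameters rather than with the paper's learning rule.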

Citation (APA)

Medina, D., & Leibold, C. (2014). Re-encoding of associations by recurrent plasticity increases memory capacity. Frontiers in Synaptic Neuroscience, 6, Article 13. https://doi.org/10.3389/fnsyn.2014.00013
