Self-generated off-line memory reprocessing strongly improves generalization in a hierarchical recurrent neural network

Abstract

Strong experimental evidence suggests that cortical memory traces are consolidated during off-line memory reprocessing, which occurs in the off-line states of sleep or waking rest. It is unclear what plasticity mechanisms are involved in this process and what changes it induces in the network. Here, we examine a hierarchical recurrent neural network that performs unsupervised learning on natural face images of different persons. The proposed network can self-generate memory replay while decoupled from external stimuli. Remarkably, recognition performance is strongly boosted after this off-line regime, specifically for novel face views that were not shown during the initial learning. This effect is independent of synapse-specific plasticity, relying entirely on homeostatic regulation of intrinsic excitability. Comparing a purely feed-forward configuration with the full version reveals a substantially stronger boost in recognition performance after the off-line regime for the fully recurrent network architecture. © 2014 Springer International Publishing Switzerland.
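The abstract does not give the model's equations, but the core mechanism it names, homeostatic regulation of intrinsic excitability during self-generated off-line activity with synaptic weights held fixed, can be illustrated with a toy sketch. Everything below (rate-coded sigmoid units, the unit count, the target rate, the learning rate, and the use of the instantaneous rate instead of a running average) is an illustrative assumption, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 50                                         # number of units (assumed)
W = rng.normal(0, 1.0 / np.sqrt(N), (N, N))    # fixed recurrent weights: no synaptic plasticity
np.fill_diagonal(W, 0.0)                       # no self-connections
theta = np.zeros(N)                            # per-unit intrinsic excitability
target_rate = 0.1                              # homeostatic set point (assumed)
eta = 0.01                                     # homeostatic learning rate (assumed)

def step(r):
    """One update of rate-coded sigmoid units driven by recurrence and excitability."""
    return 1.0 / (1.0 + np.exp(-(W @ r + theta)))

# Off-line regime: the network is decoupled from external input and
# self-generates activity from a random initial state ("replay").
r = rng.random(N)
for t in range(5000):
    r = step(r)
    # Homeostatic regulation of intrinsic excitability: each unit nudges
    # its excitability toward the target mean rate; W never changes.
    theta += eta * (target_rate - r)

print("mean rate after off-line regime:", round(r.mean(), 3))
```

In this simplified picture, the off-line phase changes only the excitability vector theta, which is the abstract's point: the post-replay improvement does not require synapse-specific weight updates.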

Citation (APA)

Jitsev, J. (2014). Self-generated off-line memory reprocessing strongly improves generalization in a hierarchical recurrent neural network. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8681 LNCS, pp. 659–666). Springer. https://doi.org/10.1007/978-3-319-11179-7_83
