On Forgetful Attractor Network Memories

  • Lansner A
  • Sandberg A
  • Petersson K
  • Ingvar M

Abstract

A real-time, online learning system with capacity limits needs to gradually forget old information in order to avoid catastrophic forgetting. This can be achieved by allowing new information to overwrite old, as in a so-called palimpsest memory. This paper describes an incremental learning rule, based on the Bayesian confidence propagation neural network (BCPNN), that has palimpsest properties when employed in an attractor neural network. The network does not suffer from catastrophic forgetting, has a capacity dependent on the learning time constant, and exhibits faster convergence for newer patterns.
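The palimpsest behavior described in the abstract can be illustrated with a much simpler stand-in for the paper's BCPNN rule: a Hopfield-style Hebbian update whose weights decay exponentially with a time constant, so each new pattern partially overwrites the trace of all earlier ones. The rule, the time constant `TAU`, and all parameter values below are illustrative assumptions, not the paper's actual learning rule.

```python
import numpy as np

TAU = 5.0  # assumed learning time constant: larger TAU -> slower forgetting

def store(W, xi, tau=TAU):
    """Exponentially decaying Hebbian update: each new pattern
    partially overwrites the trace of all earlier ones (palimpsest)."""
    return (1.0 - 1.0 / tau) * W + (1.0 / tau) * np.outer(xi, xi)

def recall(W, x, steps=20):
    """Synchronous sign-function attractor dynamics."""
    for _ in range(steps):
        x = np.sign(W @ x)
        x[x == 0] = 1  # break ties deterministically
    return x

rng = np.random.default_rng(0)
N = 100
patterns = rng.choice([-1, 1], size=(30, N))  # 30 random bipolar patterns

W = np.zeros((N, N))
for xi in patterns:
    W = store(W, xi)
np.fill_diagonal(W, 0.0)  # no self-coupling

# Cue the most recently stored pattern with 10% of its bits flipped.
cue = patterns[-1].copy()
cue[rng.choice(N, size=10, replace=False)] *= -1

overlap_new = np.mean(recall(W, cue) == patterns[-1])
overlap_old = np.mean(recall(W, patterns[0].copy()) == patterns[0])
# overlap_new stays high: the newest pattern is retrieved from a noisy cue,
# while the oldest pattern's trace has decayed by (1 - 1/TAU)^29 and is lost.
```

Because old traces shrink geometrically rather than accumulating, the network never exceeds its effective load of roughly `TAU` patterns, which is why there is no catastrophic forgetting: capacity is traded for graceful, age-ordered erasure.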

Citation (APA)

Lansner, A., Sandberg, A., Petersson, K. M., & Ingvar, M. (2000). On Forgetful Attractor Network Memories (pp. 54–62). https://doi.org/10.1007/978-1-4471-0513-8_7
