Humans regularly acquire new information without losing memory for what they learned before, but neural network models suffer from catastrophic forgetting, in which new learning impairs previously learned function. A recent article presents an algorithm that spares learning at synapses important for previously learned function, reducing catastrophic forgetting.
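The idea of sparing important synapses can be sketched as an importance-weighted quadratic penalty that pulls protected weights back toward their values after the previous task. The sketch below is illustrative, not the cited article's exact method; the function name, the per-synapse `importance` vector, and the penalty strength `lam` are all assumptions.

```python
import numpy as np

def penalized_grad(grad_new_task, theta, theta_old, importance, lam=1.0):
    """Gradient of L_new + (lam/2) * sum(importance * (theta - theta_old)**2).

    A large `importance` entry pulls that weight back toward its old value,
    so synapses critical for prior tasks change little during new learning.
    """
    return grad_new_task + lam * importance * (theta - theta_old)

# Weights after the old task vs. current weights during new-task training.
theta_old = np.array([1.0, -0.5, 0.3])
theta = np.array([1.2, -0.1, 0.9])
grad_new = np.array([0.5, 0.5, 0.5])   # gradient from the new task alone
importance = np.array([10.0, 0.0, 0.0])  # first synapse matters for the old task

g = penalized_grad(grad_new, theta, theta_old, importance, lam=1.0)
print(g)  # the protected synapse's update is dominated by the pull toward theta_old
```

Unprotected synapses (zero importance) follow the new-task gradient unchanged, which is how such schemes leave capacity free for new learning.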
Citation: Hasselmo, M. E. (2017). Avoiding catastrophic forgetting. Trends in Cognitive Sciences. https://doi.org/10.1016/j.tics.2017.04.001