Artificial neural networks (ANNs) are known to suffer from catastrophic forgetting: when trained on multiple tasks sequentially, they perform well on the most recently learned task while failing on previously learned tasks. In biological networks, sleep is known to play a role in memory consolidation and incremental learning. Motivated by the processes known to be involved in sleep generation in biological networks, we developed an algorithm that implements a sleep-like phase in ANNs. In an incremental learning framework, we demonstrate that sleep is able to recover older tasks that were otherwise forgotten. We show that sleep creates unique representations of each class of inputs and that neurons relevant to previous tasks fire during sleep, simulating replay of previously learned memories.
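To make the idea concrete, the following is a minimal conceptual sketch, not the authors' exact algorithm: a toy two-layer network is trained on tasks one after another, and between tasks a "sleep" phase drives the network with random input while applying a local Hebbian-style update. All names and parameters (train_task, sleep_phase, hebbian_rate, the network sizes) are illustrative assumptions for this sketch only.

```python
import numpy as np

rng = np.random.default_rng(0)

D_IN, D_HID, D_OUT = 20, 32, 4            # toy dimensions (assumed)
W1 = rng.normal(0, 0.1, (D_IN, D_HID))    # input -> hidden weights
W2 = rng.normal(0, 0.1, (D_HID, D_OUT))   # hidden -> output weights


def forward(x):
    """ReLU hidden layer followed by a linear readout."""
    h = np.maximum(0.0, x @ W1)
    return h, h @ W2


def train_task(xs, ys, lr=0.05, epochs=50):
    """Awake phase: plain SGD on squared error for one task."""
    global W1, W2
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            h, out = forward(x)
            err = out - y
            grad_W2 = np.outer(h, err)
            grad_h = (W2 @ err) * (h > 0)
            W2 -= lr * grad_W2
            W1 -= lr * np.outer(x, grad_h)


def sleep_phase(steps=500, hebbian_rate=1e-3):
    """Sleep-like phase (illustrative): drive the network with sparse
    random input and strengthen connections between co-active units,
    standing in for offline, unsupervised replay."""
    global W1, W2
    for _ in range(steps):
        x = (rng.random(D_IN) < 0.2).astype(float)   # sparse random drive
        h, out = forward(x)
        W1 += hebbian_rate * np.outer(x, h)           # pre * post activity
        W2 += hebbian_rate * np.outer(h, out)
        W1 *= 0.999                                   # mild decay keeps weights bounded
        W2 *= 0.999


# Incremental learning: train task A, sleep, train task B, sleep, ...
for task_id in range(2):
    xs = rng.random((100, D_IN))
    ys = np.eye(D_OUT)[rng.integers(0, D_OUT, 100)]   # random toy labels
    train_task(xs, ys)
    sleep_phase()
```

The actual method evaluates how the sleep phase affects accuracy on earlier tasks and which neurons reactivate during sleep; this sketch only shows where such a phase sits in a sequential training loop.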
CITATION STYLE
Tadros, T., Krishnan, G., Ramyaa, R., & Bazhenov, M. (2020). Biologically inspired sleep algorithm for reducing catastrophic forgetting in neural networks (student abstract). In AAAI 2020 - 34th AAAI Conference on Artificial Intelligence (pp. 13933–13934). AAAI Press.