Distilled Replay: Overcoming Forgetting Through Synthetic Samples

Abstract

Replay strategies are Continual Learning techniques that mitigate catastrophic forgetting by keeping a buffer of patterns from previous experiences and interleaving them with new data during training. The number of patterns stored in the buffer is a critical parameter that largely determines both the final performance and the memory footprint of the approach. This work introduces Distilled Replay, a novel replay strategy for Continual Learning that mitigates forgetting with a very small buffer (one pattern per class) of highly informative samples. Distilled Replay builds the buffer through a distillation process that compresses a large dataset into a tiny set of informative examples. We show the effectiveness of Distilled Replay against popular replay-based strategies on four Continual Learning benchmarks.
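
To make the mechanism concrete, the sketch below shows one way such a buffer could be built in PyTorch. This is an illustrative reconstruction based on standard bilevel dataset distillation (learning synthetic inputs so that a model briefly trained on them fits real data), not the authors' exact algorithm; model_fn, the hyperparameters, the single inner SGD step, and the use of torch.func.functional_call (PyTorch >= 2.0) are all assumptions.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def distill_buffer(real_loader, model_fn, n_classes, img_shape,
                       outer_steps=100, inner_lr=0.01, outer_lr=0.1,
                       device="cpu"):
        # Compress a task's dataset into one learnable synthetic sample
        # per class, with fixed integer class labels.
        syn_x = torch.randn(n_classes, *img_shape, device=device,
                            requires_grad=True)
        syn_y = torch.arange(n_classes, device=device)
        opt = torch.optim.SGD([syn_x], lr=outer_lr)

        for _ in range(outer_steps):
            model = model_fn().to(device)  # fresh random initialisation
            # Inner step: one SGD update on the synthetic buffer, kept
            # differentiable w.r.t. syn_x via create_graph=True.
            inner_loss = F.cross_entropy(model(syn_x), syn_y)
            grads = torch.autograd.grad(inner_loss,
                                        list(model.parameters()),
                                        create_graph=True)
            fast_weights = {name: p - inner_lr * g
                            for (name, p), g
                            in zip(model.named_parameters(), grads)}
            # Outer step: after training on the buffer, the model should
            # fit a batch of real data; backprop into the synthetic images.
            x_real, y_real = next(iter(real_loader))
            x_real, y_real = x_real.to(device), y_real.to(device)
            logits = torch.func.functional_call(model, fast_weights,
                                                (x_real,))
            outer_loss = F.cross_entropy(logits, y_real)
            opt.zero_grad()
            outer_loss.backward()
            opt.step()
        return syn_x.detach(), syn_y

    # Hypothetical usage on an MNIST-like task:
    # model_fn = lambda: nn.Sequential(nn.Flatten(), nn.Linear(784, 10))
    # buf_x, buf_y = distill_buffer(train_loader, model_fn, 10, (1, 28, 28))

During continual training, the distilled pairs (buf_x, buf_y) would then be interleaved with each new task's mini-batches, as in a conventional replay buffer, but at a cost of a single stored pattern per class.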

Cite

Rosasco, A., Carta, A., Cossu, A., Lomonaco, V., & Bacciu, D. (2022). Distilled Replay: Overcoming Forgetting Through Synthetic Samples. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 13418 LNAI, pp. 104–117). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-17587-9_8
