Replay strategies are Continual Learning techniques that mitigate catastrophic forgetting by keeping a buffer of patterns from previous experiences and interleaving them with new data during training. The number of patterns stored in the buffer is a critical parameter that largely determines both the final performance and the memory footprint of the approach. This work introduces Distilled Replay, a novel replay strategy for Continual Learning that mitigates forgetting while keeping a very small buffer (one pattern per class) of highly informative samples. Distilled Replay builds the buffer through a distillation process which compresses a large dataset into a tiny set of informative examples. We demonstrate the effectiveness of Distilled Replay against popular replay-based strategies on four Continual Learning benchmarks.
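To make the buffer-construction idea concrete, below is a minimal sketch (not the authors' code) of the bi-level optimization behind dataset distillation: one learnable synthetic sample per class is optimized so that a model trained for a few steps on those samples fits the real data of the current experience. The toy data, the linear model, and all hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
num_classes, feat_dim = 2, 20

# Toy "real" dataset for the current experience (stand-in for e.g. MNIST digits).
class_means = torch.stack([torch.full((feat_dim,), 1.5), torch.full((feat_dim,), -1.5)])
real_x = torch.randn(200, feat_dim) + torch.repeat_interleave(class_means, 100, dim=0)
real_y = torch.repeat_interleave(torch.arange(num_classes), 100)

# Replay buffer: one learnable synthetic sample per class (the distilled memory).
syn_x = torch.randn(num_classes, feat_dim, requires_grad=True)
syn_y = torch.arange(num_classes)
buffer_opt = torch.optim.Adam([syn_x], lr=0.05)

# Outer loop: optimize the synthetic samples.
for outer_step in range(200):
    model = nn.Linear(feat_dim, num_classes)  # fresh random init each time
    params = dict(model.named_parameters())

    # Inner loop: a few differentiable SGD steps on the synthetic buffer only.
    lr_inner = 0.1
    for _ in range(3):
        logits = torch.func.functional_call(model, params, (syn_x,))
        inner_loss = F.cross_entropy(logits, syn_y)
        grads = torch.autograd.grad(inner_loss, list(params.values()), create_graph=True)
        params = {name: p - lr_inner * g for (name, p), g in zip(params.items(), grads)}

    # Outer loss: the inner-trained model should fit the real data;
    # its gradient flows back through the inner updates into syn_x.
    outer_logits = torch.func.functional_call(model, params, (real_x,))
    outer_loss = F.cross_entropy(outer_logits, real_y)

    buffer_opt.zero_grad()
    outer_loss.backward()
    buffer_opt.step()

print("final outer loss:", outer_loss.item())
```

At continual-learning time, the distilled samples in `syn_x` would simply be interleaved with the mini-batches of each new experience, in the usual replay fashion, so that the memory cost stays at one sample per class.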
Rosasco, A., Carta, A., Cossu, A., Lomonaco, V., & Bacciu, D. (2022). Distilled Replay: Overcoming Forgetting Through Synthetic Samples. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 13418 LNAI, pp. 104–117). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-17587-9_8