In this paper, we propose a novel single-task continual learning framework named Bi-Objective Continual Learning (BOCL), which aims both to consolidate historical knowledge and to learn from new data. On one hand, we preserve old knowledge using a small set of pillars and develop a pillar consolidation (PLC) loss that alleviates the catastrophic forgetting problem. On the other hand, we develop a contrastive pillar (CPL) loss term to improve classification performance, and examine several data sampling strategies for efficient onsite learning of the 'new' with a reasonable amount of computational resources. Comprehensive experiments on CIFAR10/100, CORe50 and a subset of ImageNet validate the BOCL framework. We also report the accuracy achieved by the different sampling strategies when used to finetune a given CNN model. The code will be released.
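Since the abstract does not give the exact formulations, the following is a minimal PyTorch sketch of how the two loss terms could be combined with a standard cross-entropy objective. It models PLC as feature-level consolidation on the stored pillars and CPL as a margin-based contrastive loss against pillar features; the function names, trade-off weights (lam_plc, lam_cpl) and the margin are illustrative assumptions, not the paper's definitions.

```python
import torch
import torch.nn.functional as F

def plc_loss(pillar_feat_new, pillar_feat_old):
    # Pillar consolidation (PLC), sketched as feature preservation:
    # keep the current features of the stored pillars close to the
    # features recorded before finetuning, to alleviate forgetting.
    return F.mse_loss(pillar_feat_new, pillar_feat_old)

def cpl_loss(feat, labels, pillar_feat, pillar_labels, margin=1.0):
    # Contrastive pillar (CPL), sketched as a margin-based contrastive
    # term: pull each new sample toward same-class pillars and push it
    # at least `margin` away from other-class pillars.
    d = torch.cdist(feat, pillar_feat)                  # (B, P) pairwise distances
    same = (labels.unsqueeze(1) == pillar_labels.unsqueeze(0)).float()
    diff = 1.0 - same
    pos = (d * same).sum(1) / same.sum(1).clamp(min=1)  # mean same-class distance
    neg = (F.relu(margin - d) * diff).sum(1) / diff.sum(1).clamp(min=1)
    return (pos + neg).mean()

def bocl_loss(logits, y_new, feat_new, pillar_feat_new, pillar_feat_old,
              pillar_labels, lam_plc=1.0, lam_cpl=0.1):
    # Bi-objective combination: cross-entropy learns the 'new', while
    # PLC and CPL consolidate the 'known'. The weights are illustrative.
    ce = F.cross_entropy(logits, y_new)
    return (ce
            + lam_plc * plc_loss(pillar_feat_new, pillar_feat_old)
            + lam_cpl * cpl_loss(feat_new, y_new, pillar_feat_new, pillar_labels))
```

In this sketch, `feat_new` and `logits` come from the current model on a batch of new data, while the pillar features are recomputed each step on the stored pillar set; storing the old-model pillar features once, before finetuning, keeps the consolidation target fixed.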
Tao, X., Hong, X., Chang, X., & Gong, Y. (2020). Bi-objective continual learning: Learning "New" while consolidating "Known." In AAAI 2020 - 34th AAAI Conference on Artificial Intelligence (pp. 5989–5996). AAAI Press. https://doi.org/10.1609/aaai.v34i04.6060