Class-incremental learning is an online learning paradigm in which the set of classes to be recognized grows over time under a limited memory budget, storing only a subset of examples from past tasks. At a task transition, we observe an unintended imbalance in confidence (or likelihood) between the classes of past tasks and those of the new task. We argue that this imbalance aggravates catastrophic forgetting in class-incremental learning. We propose a simple yet effective learning objective that balances the confidence of old-task and new-task classes in the class-incremental setup. In addition, we compare various sample-memory configuration strategies and propose a novel sample-memory management policy that further alleviates forgetting. The proposed method outperforms the state of the art on many evaluation metrics, including accuracy and forgetting (F), by a large margin (up to 5.71% in A10 and 17.1% in F10) in extensive empirical validation on multiple visual recognition datasets such as CIFAR100, TinyImageNet, and a subset of ImageNet.
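One way to realize a confidence-balancing objective of this kind is to add a penalty on the gap between the average probability mass the model assigns to old-task versus new-task classes. The sketch below is an illustrative assumption, not the paper's exact formulation: the function name `confidence_balance_loss`, the weight `lam`, and the absolute-gap penalty are all hypothetical choices.

```python
import numpy as np

def confidence_balance_loss(logits, labels, num_old_classes, lam=0.1):
    """Cross-entropy plus a penalty on the confidence gap between
    old-task and new-task classes (illustrative sketch only)."""
    # Numerically stable softmax over the class logits.
    z = logits - logits.max(axis=1, keepdims=True)
    probs = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    # Standard cross-entropy on the ground-truth labels.
    ce = -np.log(probs[np.arange(len(labels)), labels] + 1e-12).mean()
    # Mean probability mass assigned to old- vs. new-task classes.
    old_conf = probs[:, :num_old_classes].sum(axis=1).mean()
    new_conf = probs[:, num_old_classes:].sum(axis=1).mean()
    # Penalize the imbalance between the two groups of classes.
    return ce + lam * abs(old_conf - new_conf)
```

With uniform logits the old/new confidence gap is zero and the loss reduces to plain cross-entropy, so the penalty only activates when the model systematically favors one group of classes.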
Kang, D., Jo, Y., Nam, Y., & Choi, J. (2020). Confidence Calibration for Incremental Learning. IEEE Access, 8, 126648–126660. https://doi.org/10.1109/ACCESS.2020.3007234