Alleviate Representation Overlapping in Class Incremental Learning by Contrastive Class Concentration

07/26/2021
by Zixuan Ni, et al.

The challenge of Class Incremental Learning (CIL) lies in the difficulty for a learner to discern the old classes' data from the new, since no data from previous classes is preserved. In this paper, we reveal three causes of catastrophic forgetting at the representational level: representation forgetting, representation overlapping, and classifier deviation. Based on these observations, we propose a new CIL framework, Contrastive Class Concentration for CIL (C4IL), to alleviate the phenomenon of representation overlapping; it works in both memory-based and memory-free settings. Our framework leverages the class concentration effect of contrastive representation learning, yielding a representation distribution with better intra-class compactness and inter-class separability. Quantitative experiments showcase the effectiveness of our framework: it outperforms the baseline methods by 5% in terms of average and top-1 accuracy in 10-phase and 20-phase CIL. Qualitative results also demonstrate that our method generates a more compact representation distribution that alleviates the overlapping problem.
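The "class concentration effect" the abstract refers to can be illustrated with a supervised contrastive objective that pulls same-class embeddings together and pushes different classes apart. The sketch below is a minimal PyTorch illustration of that idea (SupCon-style), not the authors' exact C4IL objective; the function name `class_concentration_loss` and the temperature value are assumptions made for the example.

```python
import torch
import torch.nn.functional as F

def class_concentration_loss(features: torch.Tensor,
                             labels: torch.Tensor,
                             temperature: float = 0.1) -> torch.Tensor:
    """Supervised contrastive loss: concentrate same-class embeddings.

    features: (N, D) batch of embeddings from any backbone.
    labels:   (N,) integer class labels.
    """
    features = F.normalize(features, dim=1)          # unit-norm embeddings
    logits = features @ features.t() / temperature   # scaled cosine similarities
    n = features.size(0)
    self_mask = torch.eye(n, dtype=torch.bool, device=features.device)
    logits = logits.masked_fill(self_mask, float('-inf'))  # exclude self-pairs
    # Positives are the other samples of the same class in the batch.
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    # Average log-likelihood of positives per anchor; anchors without an
    # in-batch positive are skipped.
    pos_counts = pos_mask.sum(dim=1)
    valid = pos_counts > 0
    sum_pos = log_prob.masked_fill(~pos_mask, 0.0).sum(dim=1)
    return -(sum_pos[valid] / pos_counts[valid]).mean()

# Toy usage: two samples per class; minimizing this loss yields the
# intra-class compactness / inter-class separability described above.
feats = torch.randn(8, 128, requires_grad=True)
labels = torch.tensor([0, 0, 1, 1, 2, 2, 3, 3])
loss = class_concentration_loss(feats, labels)
loss.backward()
```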
