Key Concepts
Imbalanced data poses challenges in class-incremental learning; the proposed method addresses them by reweighting gradients for balanced optimization and unbiased classifier learning.
Abstract
Class-Incremental Learning (CIL) faces challenges with imbalanced data distributions, which lead to skewed gradient updates and catastrophic forgetting. The proposed method reweights gradients to balance optimization and mitigate forgetting. A distribution-aware knowledge distillation loss aligns the output logits with the distribution of the lost training data. Experimental results show consistent improvements across various datasets and evaluation protocols.
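For illustration, below is a minimal PyTorch sketch of the two ingredients described above. It is a sketch under stated assumptions, not the paper's implementation: inverse class-frequency weights stand in for the gradient-derived weights the method actually computes, and reweighted_ce_loss, distribution_aware_kd_loss, class_counts, and old_class_prior are hypothetical names introduced here.

```python
import torch
import torch.nn.functional as F

def reweighted_ce_loss(logits, targets, class_counts):
    """Cross-entropy with per-class reweighting.

    Under-represented classes get larger weights so gradient
    magnitudes are more balanced across old and new classes.
    Assumption: weights come from raw per-class sample counts,
    whereas the paper derives them from accumulated gradients.
    """
    # Inverse-frequency weights, normalized to mean 1.
    counts = class_counts.float()
    weights = counts.sum() / (len(counts) * counts)
    return F.cross_entropy(logits, targets, weight=weights)

def distribution_aware_kd_loss(student_logits, teacher_logits,
                               old_class_prior, T=2.0):
    """Distillation loss over old-class logits, reweighted by a
    prior over the no-longer-available old training data.

    Both logit tensors are assumed already restricted to the old
    classes; old_class_prior is a hypothetical per-class
    probability vector standing in for the lost distribution.
    """
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    # Per-class KL terms, weighted by the old-data prior.
    kl = p_teacher * (p_teacher.clamp_min(1e-8).log() - log_p_student)
    return (kl * old_class_prior.unsqueeze(0)).sum(dim=1).mean() * (T * T)
```

In an incremental step, the two terms would be combined, e.g. total_loss = reweighted_ce_loss(...) + lam * distribution_aware_kd_loss(...), with lam a balancing coefficient; the exact weighting scheme is the part the paper's gradient analysis determines.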
Statistics
A major challenge of CIL arises when it is applied to real-world data with a non-uniform distribution.
Our method addresses this by reweighting the gradients toward balanced optimization and unbiased classifier learning.
We validate our method on CIFAR-100, ImageNetSubset, and Food101 across various evaluation protocols.
Our method achieves promising results under LFH (Learning From Half) and significant improvements under LFS (Learning From Scratch), without requiring extra training stages or parameters.
Our method shows consistent improvements under CIL and proves effective in long-tailed recognition.