The author proposes Contrastive Continual Learning via Importance Sampling (CCLIS), which preserves knowledge by recovering previous data distributions through importance sampling, and introduces a Prototype-instance Relation Distillation (PRD) loss to maintain the relationships between prototypes and sample representations. The central aim is to mitigate catastrophic forgetting in continual learning by combining importance sampling with PRD to strengthen knowledge preservation.
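The summary does not give the exact form of the PRD loss, but a relation-distillation objective of this kind is commonly realized as a KL divergence between two prototype-instance similarity distributions: a "teacher" distribution computed with the old (frozen) prototypes and a "student" distribution computed with the current ones. The sketch below is an illustrative assumption, not the paper's verified formulation; the function names, the cosine similarity, and the temperature value are all hypothetical choices.

```python
import math

def softmax(scores, tau=0.1):
    # Temperature-scaled softmax over similarity scores (tau is an assumed hyperparameter).
    m = max(s / tau for s in scores)
    exps = [math.exp(s / tau - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def cosine(u, v):
    # Cosine similarity between two vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def prd_loss(old_prototypes, new_prototypes, instance):
    # Teacher: prototype-instance relation distribution under the previous model's prototypes.
    # Student: the same relation distribution under the current prototypes.
    # (In practice the instance embedding would also come from old vs. new encoders;
    #  here a single embedding is reused to keep the sketch self-contained.)
    p = softmax([cosine(c, instance) for c in old_prototypes])
    q = softmax([cosine(c, instance) for c in new_prototypes])
    # KL(p || q) penalizes drift in how the instance relates to the prototypes.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
```

Because KL divergence is non-negative and zero only when the two distributions match, minimizing this term pushes the current model to keep each sample's relation to the prototypes close to what the previous model had learned.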
Focusing on contrastive learning methods that yield high-quality representations, the work proposes effective measures for knowledge preservation and against catastrophic forgetting, so as to avoid the forgetting problem that arises in conventional continual learning settings.
In short: an introduction to an effective Contrastive Continual Learning methodology for overcoming catastrophic forgetting through high-quality representations.