FOCIL proposes a novel approach for online class incremental learning, achieving high accuracy, minimal forgetting, and faster training without the need to store replay data.
FOCIL continually fine-tunes the main architecture by training a randomly pruned sparse expert for each task in online class incremental learning, aiming to prevent forgetting.
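The per-task sparse-expert idea can be sketched as follows: for each task, randomly select a subnetwork from the weights not yet claimed by earlier tasks, update only that subnetwork, then freeze it. This is a minimal NumPy sketch under assumed sparsity levels and a stand-in gradient; FOCIL's actual pruning procedure and training loop may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def new_task_mask(frozen, sparsity=0.8):
    """Randomly prune a sparse 'expert' from the weights not yet
    frozen by earlier tasks (sparsity level is an assumption)."""
    mask = np.zeros_like(frozen)
    free_idx = np.flatnonzero(~frozen)
    keep = int(len(free_idx) * (1 - sparsity))  # fraction of free weights kept
    chosen = rng.choice(free_idx, size=keep, replace=False)
    mask.flat[chosen] = True
    return mask

W = np.zeros((4, 4))                  # shared backbone weights
frozen = np.zeros_like(W, dtype=bool)

for task in range(3):
    mask = new_task_mask(frozen)
    grad = rng.normal(size=W.shape)   # stand-in for a task's gradient
    W -= 0.1 * grad * mask            # only the new expert's weights move
    frozen |= mask                    # freeze the expert after its task
```

Because each new mask is drawn only from unfrozen positions, updates for a new task never touch weights belonging to earlier experts, which is what prevents forgetting.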
The proposed method improves generalization in online class incremental learning by incorporating a self-distillation mechanism and a new memory update method that prioritizes the storage of easily misclassified samples.
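The prioritized memory update can be illustrated with a small sketch: keep a fixed-capacity buffer and retain the samples most prone to misclassification, here approximated by their loss values. The heap-based policy and the loss criterion are assumptions for illustration, not the paper's exact buffer rule.

```python
import heapq

def update_memory(memory, batch, capacity):
    """Retain the samples with the highest loss, i.e. the most easily
    misclassified ones (a sketch; the actual criterion may differ)."""
    for sample, loss in batch:
        if len(memory) < capacity:
            heapq.heappush(memory, (loss, sample))
        elif loss > memory[0][0]:          # new sample harder than easiest kept
            heapq.heapreplace(memory, (loss, sample))
    return memory

mem = []
stream = [("a", 0.2), ("b", 1.5), ("c", 0.9), ("d", 2.1), ("e", 0.1)]
update_memory(mem, stream, capacity=3)
# buffer now holds the three highest-loss samples: b, c, d
```

A min-heap keyed on loss makes each update O(log capacity), which suits the single-pass constraint of online learning.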