The proposed framework, which combines the studied tricks, improves the stability, adaptability, and overall performance of FSCIL.
Enhancing few-shot class-incremental learning through orthogonality and contrast.
Calibrating the covariance matrices of new few-shot classes based on their semantic similarity to the many-shot base classes significantly improves the classification performance in few-shot class-incremental learning settings.
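A minimal sketch of this calibration idea: a few-shot class's noisy covariance is blended with base-class covariances weighted by semantic similarity. The function name, the `alpha` blending weight, and plain similarity normalization are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def calibrate_covariance(few_shot_cov, base_covs, similarities, alpha=0.5):
    """Blend a noisy few-shot covariance with base-class covariances.

    base_covs     : list of (d, d) covariance matrices of many-shot base classes
    similarities  : semantic similarity of the new class to each base class
    alpha         : how much of the raw few-shot estimate to keep (assumed knob)
    """
    w = np.asarray(similarities, dtype=float)
    w = w / w.sum()  # normalize similarities into transfer weights
    # Similarity-weighted average of base-class covariances
    transferred = sum(wi * c for wi, c in zip(w, base_covs))
    # Convex combination: keep some of the few-shot estimate, borrow the rest
    return alpha * few_shot_cov + (1.0 - alpha) * transferred
```

With `alpha = 0` the new class's covariance is purely transferred from semantically similar base classes; with `alpha = 1` no calibration happens.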
This paper proposes a Knowledge Adaptation Network (KANet) that adapts CLIP to the few-shot class-incremental learning (FSCIL) task by fusing CLIP's general knowledge with task-specific knowledge via a Knowledge Adapter module, and further refines this knowledge adaptation with an Incremental Pseudo Episode Learning scheme.
Contrary to the common practice of maximizing inter-class distance in few-shot class-incremental learning (FSCIL), minimizing inter-class distance, in conjunction with promoting feature spread, achieves a better balance between discriminability on base classes and transferability to new classes, leading to superior performance in FSCIL.
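A toy illustration of the two opposing terms in this claim: a penalty that pulls class prototypes together (minimizing inter-class distance) combined with a bonus for feature spread that prevents representational collapse. All function names and the `lam` trade-off weight are hypothetical; this is a conceptual sketch, not the paper's loss.

```python
import numpy as np

def inter_class_pull(prototypes):
    """Mean pairwise distance between class prototypes; MINIMIZING this
    pulls class centers together, contrary to the usual max-separation goal."""
    total, count = 0.0, 0
    for i in range(len(prototypes)):
        for j in range(i + 1, len(prototypes)):
            total += np.linalg.norm(prototypes[i] - prototypes[j])
            count += 1
    return total / count

def feature_spread(features):
    """Total per-dimension variance; encouraging this keeps features
    spread out while the prototypes are drawn together."""
    return float(np.var(features, axis=0).sum())

def combined_objective(features, labels, lam=1.0):
    """Lower is better: small inter-class distance, large feature spread."""
    classes = np.unique(labels)
    protos = [features[labels == c].mean(axis=0) for c in classes]
    return inter_class_pull(protos) - lam * feature_spread(features)
```

The intuition: tightly packed prototypes with widely spread features leave room to insert new classes (transferability) without sacrificing separability on base classes.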
Contrary to the conventional maximization of inter-class distance, minimizing inter-class distance improves both the transferability and the discriminability of representation learning in few-shot class-incremental learning.