Core Concepts
Enhancing few-shot class-incremental learning through orthogonality and contrast.
Abstract
The OrCo framework addresses challenges in Few-Shot Class-Incremental Learning (FSCIL) through two guiding principles: orthogonality and contrast. It improves generalization and mitigates catastrophic forgetting, overfitting, and intransigence. The framework proceeds in three phases (pretraining, base alignment, and few-shot alignment), each utilizing a combination of supervised and self-supervised contrastive losses. Experimental results demonstrate state-of-the-art performance across benchmark datasets.
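The three-phase flow described above can be outlined in a minimal structural sketch. Note that the function names and phase bodies below are placeholders of our own invention, not the authors' code; only the ordering of the phases follows the summary:

```python
# Sketch of OrCo's three-phase training flow (structure only; the
# phase bodies are placeholders, not the authors' implementation).
log = []

def pretrain(model, data):
    # Phase 1: contrastive pretraining on the base classes
    log.append("pretrain")

def base_align(model, data):
    # Phase 2: align base-class features (per the summary, using a
    # mix of supervised and self-supervised contrastive losses)
    log.append("base_align")

def few_shot_align(model, session):
    # Phase 3: align each incremental few-shot session while
    # preserving earlier alignment (mitigating forgetting)
    log.append("few_shot_align")

def orco_pipeline(model, base_data, sessions):
    pretrain(model, base_data)
    base_align(model, base_data)
    for session in sessions:
        few_shot_align(model, session)

orco_pipeline(None, None, range(3))
```

The point of the sketch is only that alignment happens once for the base classes and then repeatedly, session by session, for the few-shot increments.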
Introduction
FSCIL poses challenges such as catastrophic forgetting of old classes and overfitting to scarce new data.
The OrCo framework addresses these challenges through orthogonality and contrast.
OrCo Framework
Leverages orthogonality in the feature space together with contrastive learning.
Three phases: pretraining, base alignment, and few-shot alignment.
Combines supervised and self-supervised contrastive losses.
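The two ingredients above can be illustrated with a short, self-contained sketch. Everything here is our assumption about how such components are commonly built (orthonormal class targets via QR decomposition, and a supervised contrastive loss in the style of SupCon), not the paper's actual implementation:

```python
import numpy as np

def orthogonal_targets(num_classes, dim, seed=0):
    """Generate mutually orthogonal unit-norm class targets.

    Orthogonal targets spread classes maximally apart in feature
    space, the geometric idea behind OrCo (a sketch; requires
    dim >= num_classes).
    """
    rng = np.random.default_rng(seed)
    a = rng.normal(size=(dim, num_classes))
    q, _ = np.linalg.qr(a)   # columns of q are orthonormal
    return q.T               # shape (num_classes, dim)

def supcon_loss(features, labels, temperature=0.1):
    """Supervised contrastive loss over L2-normalized features."""
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    logits = f @ f.T / temperature
    n = len(labels)
    # exclude self-similarity from the softmax denominator
    logits[np.arange(n), np.arange(n)] = -np.inf
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    loss = 0.0
    for i in range(n):
        pos = (labels == labels[i]) & (np.arange(n) != i)
        if pos.any():
            loss += -log_prob[i, pos].mean()
    return loss / n
```

Pulling same-class features toward each other (the contrastive term) while anchoring classes to fixed orthogonal targets is one plausible reading of how "orthogonality and contrast" interact in the framework.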
Experimental Results
Showcase state-of-the-art performance on mini-ImageNet, CIFAR100, and CUB datasets.
OrCo outperforms previous methods across all datasets.
Related Work
Discusses few-shot learning, class-incremental learning, and FSCIL methods.
Conclusion
The OrCo method effectively addresses the challenges of FSCIL through orthogonality and contrast.
Quotes
"Our experimental results showcase state-of-the-art performance across three benchmark datasets."
"OrCo framework is a novel approach that tackles challenges in FSCIL through orthogonality and contrast."