Core Concepts
Proposing EASE for efficient model updating in Class-Incremental Learning (CIL) with Pre-Trained Models (PTMs).
Abstract
The paper introduces ExpAndable Subspace Ensemble (EASE), an approach for Pre-Trained Model-based Class-Incremental Learning. It addresses forgetting of old classes when learning new ones by training lightweight adapters that create task-specific subspaces. The method also includes a semantic-guided prototype complement strategy to synthesize features of old classes without storing exemplars. Extensive experiments on seven benchmark datasets validate EASE's superior performance.
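For intuition, the adapter mechanism can be sketched in a few lines of PyTorch. This is a minimal illustration under simplifying assumptions, not the paper's implementation: the names BottleneckAdapter and ensemble_features, the bottleneck width of 16, and applying the adapter to the backbone's output feature (adapter tuning typically inserts it inside each transformer block) are all assumptions made here.

```python
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    """Down-project, non-linearity, up-project, plus a residual path.
    Only these few parameters are trained for each incremental task;
    the pre-trained backbone stays frozen."""
    def __init__(self, dim: int = 768, bottleneck: int = 16):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.act = nn.ReLU()
        self.up = nn.Linear(bottleneck, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The residual keeps the adapted feature close to the PTM feature.
        return x + self.up(self.act(self.down(x)))

num_tasks = 3  # hypothetical number of tasks seen so far
adapters = nn.ModuleList([BottleneckAdapter() for _ in range(num_tasks)])

def ensemble_features(backbone_feat: torch.Tensor) -> torch.Tensor:
    # Concatenating every task-specific projection yields the expandable
    # subspace ensemble used for prototype-based classification.
    return torch.cat([a(backbone_feat) for a in adapters], dim=-1)

feat = torch.randn(1, 768)            # stand-in for a frozen ViT feature
print(ensemble_features(feat).shape)  # torch.Size([1, 2304]) = 768 * num_tasks
```

Because each adapter's bottleneck is tiny relative to the backbone, per-task storage stays small, which is consistent with the 0.3% parameter-cost figure reported below.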
Introduction to Class-Incremental Learning and the challenges it poses.
Utilization of Pre-Trained Models (PTMs) and the problem of forgetting old classes when the model is tuned on new ones.
Proposal of ExpAndable Subspace Ensemble (EASE) for PTM-based CIL.
Explanation of how EASE works: a lightweight adapter is trained per task to span a task-specific subspace, and a semantic-guided prototype complement strategy recovers old-class prototypes in new subspaces (see the sketches after the abstract and after this list).
Comparison with state-of-the-art methods on seven benchmark datasets.
Ablation study showing the effectiveness of each component in EASE.
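The semantic-guided prototype complement mentioned above can be sketched as follows. This is a hedged sketch, not the paper's exact formulation: the function name complement_prototypes, the cosine-similarity measure, and the softmax with a temperature are assumptions. The idea it illustrates is that similarities between old and new classes, computed in a subspace where prototypes of both exist, serve as weights to reconstruct old-class prototypes in subspaces trained after the old classes' data was gone.

```python
import torch
import torch.nn.functional as F

def complement_prototypes(P_old_s: torch.Tensor,
                          P_new_s: torch.Tensor,
                          P_new_t: torch.Tensor,
                          temperature: float = 1.0) -> torch.Tensor:
    """Synthesize old-class prototypes in a new task-specific subspace t.

    P_old_s: (C_old, d)  old-class prototypes in a subspace where they exist
    P_new_s: (C_new, d)  new-class prototypes in that same subspace
    P_new_t: (C_new, d') new-class prototypes in the new subspace t
    Returns: (C_old, d') reconstructed old-class prototypes in subspace t
    """
    # Class-to-class semantic similarity, measured where both sets exist.
    sim = F.normalize(P_old_s, dim=-1) @ F.normalize(P_new_s, dim=-1).T
    # Similarities become mixing weights (the softmax is our assumption).
    w = torch.softmax(sim / temperature, dim=-1)   # (C_old, C_new)
    # Each missing prototype is a semantics-weighted blend of new-class ones.
    return w @ P_new_t

P_old_s = torch.randn(10, 768)  # 10 old classes in an earlier subspace
P_new_s = torch.randn(5, 768)   # 5 new classes in the same subspace
P_new_t = torch.randn(5, 768)   # the same 5 new classes in the new subspace
print(complement_prototypes(P_old_s, P_new_s, P_new_t).shape)  # (10, 768)
```

No stored exemplars are needed here: everything operates on class prototypes, which matches the exemplar-free claim quoted below.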
Stats
"Extensive experiments on seven benchmark datasets verify EASE’s state-of-the-art performance."
"Parameter cost for saving adapters is 0.3% of the total backbone."
"EASE achieves best performance among all benchmarks, outperforming CODA-Prompt and ADAM."
Quotes
"No exemplars are used in EASE, making it competitive compared to traditional exemplar-based methods."
"EASE shows state-of-the-art performance with limited memory cost."