Semantically-Shifted Incremental Adapter-Tuning: A Continual Learning Framework for Vision Transformers
The core message of this paper is that incrementally tuning a shared adapter, without imposing parameter-update constraints, is an effective continual learning strategy for pre-trained vision transformers; further gains are achieved by retraining a unified classifier with semantic-shift-compensated prototypes.
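The prototype compensation step can be sketched as follows. After the adapter is tuned on a new task, old-class prototypes (stored feature means) are stale because the backbone's embedding space has drifted; one common remedy, assumed here as a minimal illustration, estimates each prototype's shift as a similarity-weighted average of the drift observed on current-task samples embedded by the old and new backbones. Function names and the bandwidth `sigma` are illustrative, not from the paper.

```python
import numpy as np

def compensate_prototypes(old_protos, feats_before, feats_after, sigma=1.0):
    """Shift stale old-class prototypes to the updated embedding space.

    old_protos:   (C, D) class prototypes computed under the old backbone
    feats_before: (N, D) current-task features from the old backbone
    feats_after:  (N, D) the same samples' features from the tuned backbone
    Returns (C, D) compensated prototypes (a sketch of semantic-drift
    compensation; not the paper's exact formulation).
    """
    drift = feats_after - feats_before              # per-sample embedding drift
    comp = np.empty_like(old_protos)
    for i, p in enumerate(old_protos):
        d2 = np.sum((feats_before - p) ** 2, axis=1)   # squared distances to prototype
        w = np.exp(-d2 / (2.0 * sigma ** 2))           # Gaussian similarity weights
        w = w / w.sum()                                # normalize to a convex combination
        comp[i] = p + w @ drift                        # apply weighted drift estimate
    return comp
```

The compensated prototypes can then be used (e.g., by sampling pseudo-features around them) to retrain a unified classifier over all classes seen so far, without storing old exemplars.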