The author proposes Elastic Feature Consolidation (EFC) to address the challenges of cold-start exemplar-free class-incremental learning, using an Empirical Feature Matrix to regularize feature drift and an asymmetric prototype-replay loss to balance new-task data against stored class prototypes.
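The drift-regularization idea above can be illustrated as a quadratic penalty on the feature-drift vector, measured in the pseudo-metric induced by the Empirical Feature Matrix. This is a minimal sketch under stated assumptions, not the paper's implementation; the function name `efm_drift_penalty` and the toy matrix `E` are illustrative.

```python
import numpy as np

def efm_drift_penalty(f_new: np.ndarray, f_old: np.ndarray, E: np.ndarray) -> float:
    """Penalize feature drift delta = f_new - f_old in the pseudo-metric
    induced by a symmetric positive-semidefinite matrix E:

        L_drift = delta^T E delta

    Directions that E marks as important for old classes become "stiff",
    while unimportant directions stay elastic (nearly free to drift).
    """
    delta = f_new - f_old
    return float(delta @ E @ delta)

# Toy example: a rank-1 E penalizes drift only along one direction.
u = np.array([1.0, 0.0, 0.0])
E = np.outer(u, u)  # "important" direction: e_1

# Drift along the important direction is penalized...
drift_along_u = efm_drift_penalty(np.array([1.0, 2.0, 3.0]),
                                  np.array([0.0, 2.0, 3.0]), E)  # 1.0

# ...while drift orthogonal to it is free.
drift_orthogonal = efm_drift_penalty(np.array([0.0, 2.0, 4.0]),
                                     np.array([0.0, 2.0, 3.0]), E)  # 0.0
```

In practice such a matrix would be estimated from feature statistics on previous tasks; the toy rank-1 `E` here only serves to show the anisotropic penalty.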
The author proposes a non-exemplar, semi-supervised class-incremental learning framework to overcome the limitations of existing methods.