Self-Supervised Dataset Distillation for Efficient Transfer Learning
The authors propose a novel self-supervised dataset distillation framework that compresses an unlabeled dataset into a small set of synthetic samples; a model pre-trained on these samples transfers effectively to various downstream tasks.
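The excerpt does not specify the distillation objective, so the following is only a toy sketch of the general idea: synthetic samples are optimized so that their features (here, random ReLU features, in the spirit of distribution-matching distillation) match those of the real unlabeled data. All names and the objective are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "unlabeled dataset": 1000 points in 16-D (stand-in for real data).
X = rng.normal(size=(1000, 16)) + 2.0

M, d = 10, X.shape[1]            # distill into 10 synthetic samples
S = rng.normal(size=(M, d))      # synthetic set, optimized below


def mm_loss_and_grad(S, X, W):
    """Feature mean-matching loss and its gradient w.r.t. S.

    Features are relu(x @ W.T); the loss is the squared distance
    between the mean features of the synthetic and real sets.
    """
    mu_real = np.maximum(X @ W.T, 0.0).mean(0)   # real feature mean
    Z = S @ W.T
    mu_syn = np.maximum(Z, 0.0).mean(0)          # synthetic feature mean
    diff = mu_syn - mu_real
    loss = float(diff @ diff)
    # Chain rule through the ReLU and the mean over the M samples.
    grad = ((Z > 0) * diff) @ W * (2.0 / len(S))
    return loss, grad


lr = 0.5
losses = []
for step in range(200):
    # Fresh random feature extractor each step, so S must match the
    # real data under many random projections, not just one.
    W = rng.normal(size=(32, d)) / np.sqrt(d)
    loss, grad = mm_loss_and_grad(S, X, W)
    S -= lr * grad
    losses.append(loss)

print(f"loss: {np.mean(losses[:20]):.3f} -> {np.mean(losses[-20:]):.3f}")
```

After distillation, the small synthetic set `S` would stand in for the full dataset when pre-training an encoder, which is the efficiency gain the abstract claims.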