This summary introduces CAKE, an approach to knowledge distillation that requires no access to the original training data. CAKE generates synthetic samples that mimic the teacher's decision boundaries, and it compares favorably with existing techniques across a range of datasets and model types.
Access to pre-trained models has become standard, but the original training data is often unavailable. CAKE addresses this gap by synthesizing samples that trace the teacher's decision boundaries, and the method is empirically validated on benchmark datasets, where the distilled students reach competitive classification accuracy.
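To make the idea concrete, the sketch below illustrates the *general* data-free distillation recipe the summary describes: optimize random noise into synthetic inputs that the teacher classifies confidently, then train a student to match the teacher's softened outputs on those inputs. This is a minimal generic sketch, not CAKE's actual algorithm; the tiny architectures, step counts, and hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical tiny teacher/student stand-ins; CAKE's real models differ.
teacher = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 4)).eval()
student = nn.Sequential(nn.Linear(8, 4))

# Step 1: synthesize inputs by pushing random noise toward confident,
# class-balanced teacher predictions (a generic stand-in for CAKE's
# boundary-mimicking sample generation).
x = torch.randn(32, 8, requires_grad=True)
targets = torch.arange(32) % 4          # assign a desired class per sample
opt_x = torch.optim.Adam([x], lr=0.1)
for _ in range(100):
    opt_x.zero_grad()
    F.cross_entropy(teacher(x), targets).backward()
    opt_x.step()

# Step 2: distill on the synthetic batch by matching softened output
# distributions (the standard KD objective with temperature T).
opt_s = torch.optim.Adam(student.parameters(), lr=1e-2)
T = 2.0
for _ in range(200):
    opt_s.zero_grad()
    with torch.no_grad():
        t_probs = F.softmax(teacher(x.detach()) / T, dim=1)
    kd = F.kl_div(F.log_softmax(student(x.detach()) / T, dim=1),
                  t_probs, reduction="batchmean") * T * T
    kd.backward()
    opt_s.step()
```

The student never sees real data: every training input is recovered from the teacher alone, which is what makes the setting "data-free".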
Ablation studies on CIFAR-10 demonstrate CAKE's effectiveness: components such as the contrastive loss and prior-knowledge injection each improve student accuracy. The method also succeeds at compressing models of different depths and architecture types.
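The summary names a contrastive loss as one of the ablated components but does not define it. As a hedged illustration only (not CAKE's exact objective), a generic supervised contrastive penalty on embeddings pulls same-label points together and pushes different-label points at least a margin apart:

```python
import torch
import torch.nn.functional as F

def pairwise_contrastive(z, labels, margin=1.0):
    """Generic contrastive penalty (illustrative; not CAKE's exact loss):
    attract same-label embeddings, repel different-label ones to `margin`."""
    d = torch.cdist(z, z)                              # pairwise distances
    same = labels.unsqueeze(0) == labels.unsqueeze(1)  # same-label mask
    eye = torch.eye(len(z), dtype=torch.bool)
    pos = d[same & ~eye].pow(2).mean()                 # pull positives in
    neg = F.relu(margin - d[~same]).pow(2).mean()      # push negatives out
    return pos + neg

z = torch.randn(8, 4)                                  # dummy embeddings
labels = torch.tensor([0, 0, 1, 1, 2, 2, 3, 3])
loss = pairwise_contrastive(z, labels)
```

A term of this shape encourages synthetic samples of different classes to stay separated in feature space, which is consistent with the boundary-mimicking goal described above.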
Furthermore, CAKE is compared against tailored methods and achieves comparable performance without their common assumptions or data-access requirements. Future research directions include privacy-preserving methodologies built on CAKE's synthetic samples.
Key insights distilled from arxiv.org, by Steven Braun..., 03-12-2024
https://arxiv.org/pdf/2306.02090.pdf