Key Concepts
Dataset distillation can enhance model robustness, as shown by a comprehensive adversarial evaluation.
Summary
The work introduces a benchmark for evaluating the adversarial robustness of distilled datasets. It studies how dataset compression relates to model robustness and shows that incorporating distilled data into training batches improves resilience against adversarial attacks.
I. Introduction
- Dataset distillation compresses a large dataset into a small synthetic set while maintaining model performance (see the sketch after this list).
- Prior work focuses on accuracy and neglects robustness evaluation.
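To make the compression step concrete, here is a minimal sketch of one common dataset distillation approach, distribution matching, in PyTorch. The embedding network, image size, and hyperparameters are illustrative assumptions, not the specific method evaluated in the paper.

```python
import torch
import torch.nn as nn

def distill(real_loader, num_classes=10, ipc=10, steps=1000, lr=0.1):
    """Learn `ipc` synthetic images per class whose feature statistics
    match those of real batches (distribution matching sketch)."""
    # Synthetic images are free parameters, initialized as noise.
    syn_x = torch.randn(num_classes * ipc, 3, 32, 32, requires_grad=True)
    syn_y = torch.arange(num_classes).repeat_interleave(ipc)
    opt = torch.optim.SGD([syn_x], lr=lr)

    # Randomly initialized feature extractor (an illustrative choice);
    # its weights are frozen, only the synthetic images are optimized.
    embed = nn.Sequential(
        nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    )
    for p in embed.parameters():
        p.requires_grad_(False)

    for _, (x, y) in zip(range(steps), real_loader):
        loss = torch.zeros(())
        for c in range(num_classes):
            real_feat = embed(x[y == c])
            if real_feat.numel() == 0:
                continue  # class absent from this real batch
            syn_feat = embed(syn_x[syn_y == c])
            # Match class-wise mean embeddings of real and synthetic data.
            loss = loss + ((real_feat.mean(0) - syn_feat.mean(0)) ** 2).sum()
        opt.zero_grad()
        loss.backward()
        opt.step()
    return syn_x.detach(), syn_y
```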
II. Related Work
- Dataset distillation methods produce compact synthetic data that achieves competitive accuracy.
- Benchmarks like DC-Bench overlook adversarial robustness evaluation.
III. Investigations
- A new benchmark comprehensively assesses the adversarial robustness of models trained on distilled data (see the evaluation sketch below).
- Frequency-domain analysis reveals correlations between the knowledge extracted during distillation and robust features (a second sketch follows).
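The following sketch shows the kind of measurement such a benchmark performs: given a model trained on the distilled set, report its accuracy under a one-step FGSM attack. The attack strength `eps` and the evaluation loop are placeholder assumptions, not the benchmark's actual protocol.

```python
import torch
import torch.nn.functional as F

def fgsm_attack(model, x, y, eps=8 / 255):
    """One-step FGSM: perturb inputs along the sign of the loss gradient."""
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    loss.backward()
    return (x + eps * x.grad.sign()).clamp(0, 1).detach()

def robust_accuracy(model, test_loader, eps=8 / 255):
    """Accuracy of `model` on adversarially perturbed test inputs."""
    model.eval()
    correct = total = 0
    for x, y in test_loader:
        x_adv = fgsm_attack(model, x, y, eps)
        with torch.no_grad():
            correct += (model(x_adv).argmax(1) == y).sum().item()
        total += y.numel()
    return correct / total
```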
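And a minimal sketch of the frequency-domain view: compare how spectral energy splits between low and high frequencies for real versus distilled images. The low-frequency radius is an arbitrary illustrative threshold, not a value from the paper.

```python
import torch

def low_freq_energy_ratio(images, radius=8):
    """Fraction of spectral energy inside a centered low-frequency disk.

    `images` is a batch of shape (N, C, H, W); a higher ratio means the
    images concentrate their energy in low frequencies.
    """
    spec = torch.fft.fftshift(torch.fft.fft2(images), dim=(-2, -1))
    power = spec.abs() ** 2
    h, w = images.shape[-2:]
    yy, xx = torch.meshgrid(
        torch.arange(h) - h // 2, torch.arange(w) - w // 2, indexing="ij"
    )
    low_mask = (yy ** 2 + xx ** 2) <= radius ** 2
    return (power * low_mask).sum() / power.sum()
```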
IV. Conclusion
- Incorporating distilled data into training batches enhances model robustness against adversarial attacks, as sketched below.
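A hedged sketch of the training recipe this conclusion points to: append distilled samples to every real batch so the model trains on both. The mixing size and sampling scheme are illustrative assumptions.

```python
import torch

def mixed_batches(real_loader, syn_x, syn_y, syn_per_batch=32):
    """Yield batches that concatenate real data with distilled samples."""
    n_syn = syn_x.size(0)
    for x, y in real_loader:
        # Randomly draw distilled samples and append them to the batch.
        idx = torch.randint(0, n_syn, (syn_per_batch,))
        yield torch.cat([x, syn_x[idx]]), torch.cat([y, syn_y[idx]])
```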
Statistics
The authors focus on compressing datasets while maintaining model performance.
The benchmark comprehensively evaluates the adversarial robustness of distilled data.