AuG-KD: Anchor-Based Mixup Generation for Out-of-Domain Knowledge Distillation
The authors argue that transferring knowledge from large models to lightweight models without access to the original training data is challenging, because the student's data may lie outside the teacher's domain. The proposed method, AuG-KD, addresses this by using anchors to align student-domain data with the teacher domain and by balancing out-of-domain (OOD) knowledge distillation against the learning of domain-specific information.
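The two ingredients named in the title, anchor-based alignment and mixup, can be illustrated with a minimal sketch. This is not the paper's actual algorithm; the function names, the Beta-distributed mixing coefficient, and the loss weighting `alpha` are illustrative assumptions based on standard mixup and distillation practice.

```python
import numpy as np

def mixup(anchor_mapped: np.ndarray, student_sample: np.ndarray,
          lam: float) -> np.ndarray:
    """Convex combination of an anchor-mapped (teacher-domain) sample and a
    student-domain sample; `lam` is typically drawn from a Beta distribution
    in standard mixup."""
    return lam * anchor_mapped + (1.0 - lam) * student_sample

def combined_loss(kd_loss: float, task_loss: float, alpha: float) -> float:
    """Hypothetical trade-off between OOD knowledge distillation and
    student-domain task learning; alpha=1 is pure distillation."""
    return alpha * kd_loss + (1.0 - alpha) * task_loss

# Example: mix a teacher-domain sample with a student-domain sample.
rng = np.random.default_rng(0)
lam = rng.beta(1.0, 1.0)  # illustrative Beta(1, 1) draw
mixed = mixup(np.ones(4), np.zeros(4), lam)
```

Sweeping `lam` (or `alpha`) from 1 toward 0 shifts training from teacher-aligned samples toward the student's own domain, which mirrors the balancing act the summary describes.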