Improving Knowledge Distillation using Orthogonal Projections by Roy Miles, Ismail Elezi, and Jiankang Deng at Huawei Noah’s Ark Lab
The authors propose a constrained feature distillation method built on orthogonal projections and task-specific normalization to improve knowledge transfer in deep learning models. Because the orthogonal projection preserves the similarity structure of the student features, the method yields consistent performance gains across a range of tasks.
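To make the core idea concrete, here is a minimal PyTorch sketch (not the authors' released code) of an orthogonally constrained projector for feature distillation. The class and method names (OrthogonalProjector, distillation_loss) and the use of BatchNorm1d as a stand-in for the paper's task-specific normalization are illustrative assumptions; the orthogonality constraint itself is imposed with torch.nn.utils.parametrizations.orthogonal.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.nn.utils.parametrizations import orthogonal


class OrthogonalProjector(nn.Module):
    """Maps student features into the teacher's feature space with an
    orthogonal (similarity-preserving) linear projection, and compares
    them against normalized teacher features."""

    def __init__(self, student_dim: int, teacher_dim: int):
        super().__init__()
        # The orthogonal parametrization keeps the projection weight
        # (semi-)orthogonal during training, so the projection does not
        # distort the pairwise structure of the student features.
        self.proj = orthogonal(nn.Linear(student_dim, teacher_dim, bias=False))
        # Normalization of the teacher targets; a plain standardization is
        # used here as an assumption in place of the task-specific choice.
        self.norm = nn.BatchNorm1d(teacher_dim, affine=False)

    def forward(self, student_feats: torch.Tensor) -> torch.Tensor:
        return self.proj(student_feats)

    def distillation_loss(self, student_feats: torch.Tensor,
                          teacher_feats: torch.Tensor) -> torch.Tensor:
        # Regress the projected student features onto the normalized,
        # detached teacher features.
        projected = self.forward(student_feats)
        target = self.norm(teacher_feats.detach())
        return F.mse_loss(projected, target)


if __name__ == "__main__":
    # Toy usage: a 512-d student distilled toward a 2048-d teacher.
    projector = OrthogonalProjector(student_dim=512, teacher_dim=2048)
    loss = projector.distillation_loss(torch.randn(32, 512), torch.randn(32, 2048))
    print(loss.item())
```

In practice this loss would be added to the student's task loss, with the projector discarded at inference time; the exact loss form and normalization in the paper may differ from this sketch.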