A novel relation-based knowledge distillation framework that combines adaptive affinity-based and kernel-based distillation, enabling lightweight student models to faithfully replicate the feature representations of powerful teacher models and to remain robust under domain shift and data heterogeneity.
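To make the two relation terms concrete, here is a minimal numpy sketch of how affinity-based and kernel-based distillation losses are commonly formed: the student is trained to match the teacher's pairwise feature relations, via a row-normalized Gram (affinity) matrix and an RBF kernel matrix. The function names, the RBF choice, and the weighting parameters `alpha` and `gamma` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def affinity_matrix(feats):
    # Row-normalized Gram matrix: pairwise inner-product affinities of a feature batch.
    g = feats @ feats.T
    norm = np.linalg.norm(g, axis=1, keepdims=True)
    return g / np.maximum(norm, 1e-12)

def rbf_kernel_matrix(feats, gamma=0.5):
    # Pairwise RBF (Gaussian) kernel matrix: exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(feats ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * feats @ feats.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def relation_distillation_loss(student_feats, teacher_feats, alpha=0.5, gamma=0.5):
    # Weighted sum of affinity-matching and kernel-matching MSE terms.
    gs, gt = affinity_matrix(student_feats), affinity_matrix(teacher_feats)
    ks, kt = rbf_kernel_matrix(student_feats, gamma), rbf_kernel_matrix(teacher_feats, gamma)
    return alpha * float(np.mean((gs - gt) ** 2)) + (1.0 - alpha) * float(np.mean((ks - kt) ** 2))
```

Because both terms depend only on relations within a batch rather than absolute feature values, the student can match the teacher even when their feature dimensions differ per matrix construction choices.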
DMLs enhance segmentation accuracy and model calibration by supporting soft labels.
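Soft-label supervision of the kind referenced above is typically implemented as a KL-divergence term between temperature-softened teacher and student distributions, which is also what drives the calibration benefit. A minimal numpy sketch, assuming per-pixel or per-sample logits as rows; the function names and the temperature value are illustrative:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Numerically stable softmax with temperature T applied row-wise.
    z = logits / T
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def soft_label_kl(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 as in standard distillation practice.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=1)
    return float(np.mean(kl)) * T * T
```

Raising T spreads probability mass over non-target classes, so the student also learns the teacher's relative class similarities rather than only the hard argmax label.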