SKDF: A Simple Knowledge Distillation Framework for Distilling Open-Vocabulary Knowledge to Open-World Object Detector
The core message of this paper is that a simple knowledge distillation approach can effectively transfer open-world knowledge from a large pre-trained vision-language model to a specialized open-world object detector, and that the resulting student even outperforms the teacher model on unknown object detection.
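To make the distillation idea concrete, the sketch below shows the standard knowledge distillation objective (temperature-softened teacher/student distributions with a KL divergence, scaled by T²). This is the generic formulation; the paper's exact loss, temperature, and matching between teacher and student detections may differ.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax over a list of raw scores.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL(teacher || student) on softened distributions, scaled by T^2
    # so gradients keep a consistent magnitude across temperatures.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# When the student exactly matches the teacher, the loss is zero.
print(distillation_loss([2.0, 0.5, -1.0], [2.0, 0.5, -1.0]))  # → 0.0
```

In a detection setting, the logits would come from region-level classification heads, with the vision-language model scoring each proposal as the teacher.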