Core Concepts
Source-Free Universal Domain Adaptation (SF-UniDA) is tackled with Global and Local Clustering (GLC) and, in GLC++, contrastive affinity learning, enabling model upcycling under various category shifts.
Abstract
The paper addresses the challenges of domain adaptation under covariate and category shift, introducing the GLC and GLC++ techniques for SF-UniDA. It covers pseudo-labeling, contrastive affinity learning, and local consensus clustering to improve model performance across adaptation scenarios.
Introduction to SF-UniDA and the need for effective domain adaptation.
Explanation of the GLC technique for separating "known" from "unknown" target data.
Enhancement of GLC to GLC++ with contrastive affinity learning.
Validation of techniques through experiments on benchmark datasets.
Comparison with existing methods in open-partial-set (OPDA), open-set (OSDA), partial-set (PDA), and closed-set (CLDA) scenarios.
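As a rough illustration of the global clustering idea that separates "known" from "unknown" target data, here is a minimal numpy sketch: target features are clustered, and clusters whose centroid resembles a source-class prototype are kept as "known". The clustering heuristic, the 2x cluster count, and the cosine threshold are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def kmeans(X, k, iters=50):
    # Deterministic farthest-point initialization, then standard Lloyd updates.
    centers = [X[0]]
    for _ in range(k - 1):
        dists = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[dists.argmax()])
    centers = np.array(centers)
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)
        assign = d.argmin(axis=1)
        for j in range(k):
            if np.any(assign == j):
                centers[j] = X[assign == j].mean(axis=0)
    return centers, assign

def global_known_unknown_split(target_feats, source_prototypes, sim_thresh=0.8):
    """Cluster target features globally; clusters whose centroid is close to a
    source-class prototype are treated as "known", the rest as "unknown".
    The cluster count and threshold are illustrative choices."""
    k = 2 * len(source_prototypes)  # leave room for novel-category clusters
    centers, assign = kmeans(target_feats, k)
    cn = centers / np.linalg.norm(centers, axis=1, keepdims=True)
    pn = source_prototypes / np.linalg.norm(source_prototypes, axis=1, keepdims=True)
    sim = cn @ pn.T  # cosine similarity: cluster centroids vs. prototypes
    known_cluster = sim.max(axis=1) > sim_thresh
    return known_cluster[assign]  # boolean "known" mask, one entry per sample
```

On well-separated data this marks samples near a source prototype as known and leaves the remaining clusters as novel-category candidates.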
Stats
Remarkably, in the most challenging open-partial-set scenarios, GLC and GLC++ surpass GATE by 16.7% and 18.6% in H-score on VisDA.
GLC++ enhances the novel category clustering accuracy of GLC by 4.3% in open-set scenarios on Office-Home.
Quotes
"We propose a novel Global and Local Clustering (GLC) technique."
"GLC++ integrates a contrastive affinity learning strategy."