The paper addresses online source-free universal domain adaptation (online SF-UniDA): adapting a pre-trained source model to a target domain in an online manner, i.e., on target data that arrives as a stream rather than as a fixed dataset, and without access to the source data. Despite its practical relevance, this realistic but challenging scenario has so far received little attention.
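To make the online constraint concrete, the loop below is a minimal sketch of the protocol, not the authors' code; `model`, `target_stream`, and `adapt_step` are illustrative names, and whether a batch is predicted on before or after the update is a protocol detail.

```python
import torch

def online_sf_unida(model, target_stream, adapt_step):
    """Online SF-UniDA protocol (sketch): unlabeled target batches
    arrive as a stream, each is seen exactly once, and the source
    data is never available during adaptation."""
    all_preds = []
    for batch in target_stream:            # single pass, no revisiting
        adapt_step(model, batch)           # self-supervised update on this batch
        with torch.no_grad():              # then predict on the same batch
            all_preds.append(model(batch).argmax(dim=1))
    return all_preds
```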
To address this task, the authors propose a novel method, Contrastive Mean Teacher (COMET), which combines three key components:
Pseudo-labeling: COMET uses a mean-teacher framework to generate reliable pseudo-labels for the target samples, leveraging entropy thresholds to handle samples of unknown classes (all three components are sketched in code after this list).
Contrastive loss: COMET applies a contrastive loss to rebuild a feature space where samples of known classes form distinct clusters and samples of unknown classes are clearly separated from them.
Entropy loss: COMET uses an entropy loss to push the classifier toward low-entropy outputs for samples of known classes and high-entropy outputs for unknown samples, enabling reliable rejection of unknown samples during inference (see the rejection sketch below).
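Taken together, the three components amount to one self-supervised update per incoming batch. The PyTorch code below is a simplified sketch under stated assumptions, not the authors' implementation: the thresholds `tau_known`/`tau_unknown`, the `temperature`, the EMA momentum `ema_m`, the loss weight `lam`, and the split of the student into `backbone` and `head` are all hypothetical, and the contrastive term is a plain supervised-contrastive loss driven by the pseudo-labels.

```python
import math
import torch
import torch.nn.functional as F

def normalized_entropy(probs, eps=1e-8):
    # Shannon entropy per sample, scaled to [0, 1] by log(num_classes).
    h = -(probs * (probs + eps).log()).sum(dim=1)
    return h / math.log(probs.size(1))

@torch.no_grad()
def pseudo_label(teacher, x, tau_known=0.3, tau_unknown=0.7):
    """Mean-teacher pseudo-labeling with entropy thresholds (sketch).
    Low-entropy predictions become known-class pseudo-labels, high-entropy
    ones are pseudo-labeled as unknown, and the rest are discarded."""
    probs = F.softmax(teacher(x), dim=1)
    h = normalized_entropy(probs)
    return probs.argmax(dim=1), h < tau_known, h > tau_unknown

def contrastive_loss(feats, labels, known, unknown, temperature=0.1):
    """Supervised-contrastive term on pseudo-labeled samples (sketch).
    Known samples sharing a pseudo-label attract each other; each unknown
    sample gets a unique id, so it only repels, which separates unknowns
    from the known-class clusters."""
    keep = known | unknown
    if keep.sum() < 2:
        return feats.new_zeros(())
    z = F.normalize(feats[keep], dim=1)
    y = labels[keep].clone()
    unk = unknown[keep]
    y[unk] = y.max() + 1 + torch.arange(int(unk.sum()), device=y.device)
    eye = torch.eye(len(y), dtype=torch.bool, device=y.device)
    sim = (z @ z.t() / temperature).masked_fill(eye, float("-inf"))
    log_prob = sim - sim.logsumexp(dim=1, keepdim=True)
    pos = (y.unsqueeze(0) == y.unsqueeze(1)) & ~eye
    n_pos = pos.sum(dim=1)
    if not (n_pos > 0).any():
        return feats.new_zeros(())
    per_anchor = -log_prob.masked_fill(~pos, 0.0).sum(dim=1)
    return (per_anchor[n_pos > 0] / n_pos[n_pos > 0]).mean()

def entropy_loss(logits, known, unknown):
    # Push classifier entropy down for known and up for unknown samples.
    h = normalized_entropy(F.softmax(logits, dim=1))
    l_known = h[known].mean() if known.any() else h.new_zeros(())
    l_unk = h[unknown].mean() if unknown.any() else h.new_zeros(())
    return l_known - l_unk

def comet_step(student, teacher, optimizer, x, ema_m=0.999, lam=1.0):
    """One online adaptation step combining all three components (sketch)."""
    labels, known, unknown = pseudo_label(teacher, x)
    feats = student.backbone(x)            # assumed feature extractor
    logits = student.head(feats)           # assumed classifier head
    loss = (contrastive_loss(feats, labels, known, unknown)
            + lam * entropy_loss(logits, known, unknown))
    if loss.requires_grad:                 # skip if no pseudo-labels survived
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    with torch.no_grad():                  # EMA update keeps the teacher stable
        for pt, ps in zip(teacher.parameters(), student.parameters()):
            pt.mul_(ema_m).add_(ps, alpha=1.0 - ema_m)
```

The teacher would typically start as a copy of the source model (e.g., `copy.deepcopy(student)`) and be updated only through the exponential moving average, never by gradients; a slow-moving teacher is what keeps the pseudo-labels stable on a non-stationary stream.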
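At inference time, the widened entropy gap turns rejection into a single threshold test. A minimal sketch, reusing the `normalized_entropy` helper from the block above; the threshold `tau_reject` and the `unknown_id` sentinel are assumptions:

```python
@torch.no_grad()
def predict_with_rejection(model, x, tau_reject=0.5, unknown_id=-1):
    # Accept the argmax class when entropy is low; otherwise reject as unknown.
    probs = F.softmax(model(x), dim=1)
    preds = probs.argmax(dim=1)
    preds[normalized_entropy(probs) > tau_reject] = unknown_id
    return preds
```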
The authors extensively evaluate COMET on two domain adaptation datasets, DomainNet and VisDA-C, across different category shift scenarios (partial-set, open-set, open-partial-set). COMET consistently outperforms competing methods and sets an initial benchmark for online SF-UniDA.
Source: Pascal Schla... at arxiv.org, 05-03-2024
https://arxiv.org/pdf/2401.17728.pdf