Core Concept
ECAP proposes a novel data augmentation strategy to reduce the impact of erroneous pseudo-labels in unsupervised domain adaptive semantic segmentation.
Summary
Abstract:
Unsupervised domain adaptation (UDA) for semantic segmentation aims to adapt a model trained on a labeled source dataset to an unlabeled target dataset.
Current self-training methods suffer from misclassified pseudo-labels, which are especially prevalent for certain classes in UDA.
ECAP introduces an extensive cut-and-paste strategy to leverage reliable pseudo-labels through data augmentation, boosting performance on domain adaptation benchmarks.
Introduction:
UDA relaxes the requirement for annotated training data by training on a labeled source dataset and an unlabeled target dataset drawn from different distributions.
Self-training methods such as DACS use cross-domain mixing augmentation to bridge the domain gap, but they still struggle with noisy pseudo-labels.
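The cross-domain mixing idea can be sketched in a few lines. This is a minimal, illustrative DACS-style class-mix on numpy arrays, assuming dense integer label maps; the function name and interface are hypothetical, not the authors' code.

```python
import numpy as np

def class_mix(src_img, src_lbl, tgt_img, tgt_lbl, rng):
    """DACS-style ClassMix sketch (illustrative only): paste the pixels
    belonging to a random half of the source image's classes onto the
    target image, and mix the label maps the same way."""
    classes = np.unique(src_lbl)
    k = max(1, len(classes) // 2)          # select half of the classes
    chosen = rng.choice(classes, size=k, replace=False)
    mask = np.isin(src_lbl, chosen)        # (H, W) paste mask
    mixed_img = np.where(mask[..., None], src_img, tgt_img)
    mixed_lbl = np.where(mask, src_lbl, tgt_lbl)
    return mixed_img, mixed_lbl
```

Because the pasted labels come from the annotated source image, the mixed sample carries partially reliable supervision even when the target pseudo-labels are noisy.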
ECAP counteracts pseudo-label noise by cutting and pasting confident pseudo-labeled target samples from a memory bank onto training images.
Extensive Cut-and-Paste (ECAP):
ECAP consists of a memory bank, a sampler, and an augmentation module to leverage confident pseudo-labels for training.
The memory bank stores pseudo-labeled target samples, the sampler selects high-confidence samples, and the augmentation module creates composite images for training.
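The three components can be sketched roughly as follows. This is a simplified illustration under assumed interfaces (`MemoryBank`, `sample_confident`, and `cut_and_paste` are hypothetical names and simplifications, not the authors' implementation); confidence is modeled as a per-pixel score map.

```python
import numpy as np

class MemoryBank:
    """Stores pseudo-labeled target samples with per-pixel confidence."""
    def __init__(self, capacity=1000):
        self.capacity = capacity
        self.samples = []  # list of (image, pseudo_label, confidence) tuples

    def add(self, image, pseudo_label, confidence):
        if len(self.samples) >= self.capacity:
            self.samples.pop(0)  # evict the oldest entry
        self.samples.append((image, pseudo_label, confidence))

def sample_confident(bank, rng, threshold=0.9):
    """Sampler: pick a stored sample whose mean confidence exceeds a threshold."""
    candidates = [s for s in bank.samples if s[2].mean() > threshold]
    if not candidates:
        return None
    return candidates[rng.integers(len(candidates))]

def cut_and_paste(tgt_img, tgt_lbl, mem_img, mem_lbl, mem_conf, threshold=0.9):
    """Augmentation: paste only the high-confidence pixels of the sampled
    memory-bank image onto the current training image and its label map."""
    mask = mem_conf > threshold        # (H, W) boolean paste mask
    out_img = tgt_img.copy()
    out_lbl = tgt_lbl.copy()
    out_img[mask] = mem_img[mask]
    out_lbl[mask] = mem_lbl[mask]
    return out_img, out_lbl
```

Pasting only high-confidence regions raises the proportion of correct pseudo-labels in each composite training image, which is the stated goal of the method.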
Experiments:
ECAP is evaluated on synthetic-to-real domain adaptation benchmarks, achieving state-of-the-art performance on GTA→Cityscapes and Synthia→Cityscapes.
Performance on day-to-nighttime and clear-to-adverse-weather benchmarks shows limitations of ECAP in low-visibility conditions.
Conclusion:
ECAP offers a promising approach to reduce pseudo-label noise in semantic segmentation, enhancing performance in unsupervised domain adaptation tasks.
Statistics
"MIC+ECAP reaches an unprecedented performance of 69.1 mIoU on the Synthia→Cityscapes benchmark."
"ECAP boosts the performance of the recent method MIC by 0.3 and 1.8 mIoU on synthetic-to-real domain adaptation benchmarks."
"ECAP increases the mIoU of DAFormer significantly, although not as much as DAFormer (denoise) and (oracle)."
Quotes
"ECAP introduces an extensive cut-and-paste strategy to leverage reliable pseudo-labels through data augmentation."
"Our method, which we call ECAP: Extensive Cut-and-Paste, is the first UDA method for semantic segmentation to increase the proportion of correct pseudo-labels in each training image."
"Through comprehensive evaluation, ECAP is shown to increase the performance of multiple UDA methods based on self-training."