
ECAP: Extensive Cut-and-Paste Augmentation for Unsupervised Domain Adaptive Semantic Segmentation


Key Concept
ECAP proposes a novel data augmentation strategy to reduce the impact of erroneous pseudo-labels in unsupervised domain adaptive semantic segmentation.
Abstract
Unsupervised domain adaptation (UDA) for semantic segmentation aims to adapt a model trained on a labeled source dataset to an unlabeled target dataset. Current self-training methods struggle with misclassified pseudo-labels, especially for certain classes in UDA. ECAP introduces an extensive cut-and-paste strategy to leverage reliable pseudo-labels through data augmentation, boosting performance on domain adaptation benchmarks.

Introduction: UDA relaxes requirements on annotated training data by utilizing source and target datasets from different distributions. Self-training methods, like DACS augmentation, bridge the domain gap but struggle with noisy pseudo-labels. ECAP aims to counteract pseudo-label noise by cut-and-pasting confident samples from a memory bank during training.

Extensive Cut-and-Paste (ECAP): ECAP consists of a memory bank, a sampler, and an augmentation module to leverage confident pseudo-labels for training. The memory bank stores pseudo-labeled target samples, the sampler selects high-confidence samples, and the augmentation module creates composite images for training.

Experiments: ECAP is evaluated on synthetic-to-real domain adaptation benchmarks, achieving state-of-the-art performance on GTA→Cityscapes and Synthia→Cityscapes. Performance on day-to-nighttime and clear-to-adverse-weather benchmarks shows limitations of ECAP in low-visibility conditions.

Conclusion: ECAP offers a promising approach to reduce pseudo-label noise in semantic segmentation, enhancing performance in unsupervised domain adaptation tasks.
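The memory bank, sampler, and augmentation module described above can be sketched in a few lines of NumPy. This is a minimal illustrative version, not the authors' implementation: the class names, the softmax-over-confidence sampler, and the DACS-style "paste half the classes" mixing rule are assumptions made for the example.

```python
import numpy as np


class EcapMemoryBank:
    """Stores pseudo-labeled target samples together with a confidence score."""

    def __init__(self):
        self.samples = []  # list of (image, pseudo_label, confidence) tuples

    def add(self, image, pseudo_label, confidence):
        self.samples.append((image, pseudo_label, confidence))

    def sample_confident(self, rng, temperature=0.1):
        # Favor high-confidence samples via a softmax over stored confidences.
        conf = np.array([c for _, _, c in self.samples])
        probs = np.exp(conf / temperature)
        probs /= probs.sum()
        idx = rng.choice(len(self.samples), p=probs)
        return self.samples[idx]


def cut_and_paste(target_image, target_label, bank, rng, n_paste=2):
    """Paste confident pseudo-labeled regions from the bank onto a target image."""
    image, label = target_image.copy(), target_label.copy()
    for _ in range(n_paste):
        src_img, src_lbl, _ = bank.sample_confident(rng)
        # Select half of the classes present in the source pseudo-label
        # (a DACS-style class mixing rule, assumed here for illustration).
        classes = np.unique(src_lbl)
        chosen = rng.choice(classes, size=max(1, len(classes) // 2), replace=False)
        mask = np.isin(src_lbl, chosen)
        image[mask] = src_img[mask]
        label[mask] = src_lbl[mask]
    return image, label
```

During training, the composite image and its mixed pseudo-label would replace (or supplement) the plain target sample in the self-training loss, so that each training image contains a larger proportion of correct pseudo-labels.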
Statistics
- "MIC+ECAP reaches an unprecedented performance of 69.1 mIoU on the Synthia→Cityscapes benchmark."
- "ECAP boosts the performance of the recent method MIC by 0.3 and 1.8 mIoU on synthetic-to-real domain adaptation benchmarks."
- "ECAP increases the mIoU of DAFormer significantly, although not as much as DAFormer (denoise) and (oracle)."
Quotes
- "ECAP introduces an extensive cut-and-paste strategy to leverage reliable pseudo-labels through data augmentation."
- "Our method, which we call ECAP: Extensive Cut-and-Paste, is the first UDA method for semantic segmentation to increase the proportion of correct pseudo-labels in each training image."
- "Through comprehensive evaluation, ECAP is shown to increase the performance of multiple UDA methods based on self-training."

Key Insights Summary

by Erik..., published at arxiv.org on 03-07-2024

https://arxiv.org/pdf/2403.03854.pdf

Deeper Questions

Question 1

How might ECAP's approach to reducing pseudo-label noise be applied to domains beyond semantic segmentation?

Question 2

What are the potential drawbacks or limitations of ECAP, and what should be considered when deploying it in real-world applications?

Question 3

How could ECAP's cut-and-paste data augmentation concept be adapted to other types of machine learning tasks?