The paper introduces SCARCE, a new method for complementary-label learning that mitigates overfitting and delivers strong classification performance. It establishes a relationship between complementary-label learning and negative-unlabeled learning. Extensive experiments on synthetic and real-world datasets validate the effectiveness of SCARCE.
Existing approaches rely on uniform or biased distribution assumptions, which may not hold in real-world scenarios. The proposed SCARCE method does not require these assumptions, offering a more practical solution. The study also investigates the impact of inaccurate class priors on classification performance.
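The negative-unlabeled view described above can be sketched in code. The following is a minimal illustration only: it assumes a logistic loss, uniformly drawn complementary labels, and known class priors, and decomposes the risk per class in a generic negative-unlabeled style. The function name and decomposition are our own assumptions, not the authors' exact SCARCE estimator.

```python
import numpy as np

def nu_risk_for_class(scores_k, comp_labels, k, prior_k):
    """Illustrative negative-unlabeled (NU) style risk for one class k.
    An example complementarily labeled k is a certain negative of class k;
    every example belongs to the 'unlabeled' pool (the data marginal).
    This is a sketch, not SCARCE's exact estimator."""
    eps = 1e-12
    p = 1.0 / (1.0 + np.exp(-scores_k))      # model's P(y = k | x)
    loss_pos = -np.log(p + eps)              # loss for predicting "is k"
    loss_neg = -np.log(1.0 - p + eps)        # loss for predicting "not k"
    neg = comp_labels == k                   # known negatives of class k
    # Mixture identity: prior_k * E_pos[loss_pos]
    #   = E_all[loss_pos] - (1 - prior_k) * E_neg[loss_pos]
    return (loss_pos.mean()
            + (1.0 - prior_k) * (loss_neg[neg].mean() - loss_pos[neg].mean()))

# Toy demo: 3 classes, uniform complementary labels, random scores.
rng = np.random.default_rng(0)
n, num_classes = 300, 3
true_y = rng.integers(0, num_classes, size=n)
# A complementary label is any class other than the true one.
comp_labels = (true_y + rng.integers(1, num_classes, size=n)) % num_classes
scores = rng.normal(size=(n, num_classes))
priors = np.full(num_classes, 1.0 / num_classes)  # assumed class priors

per_class = [nu_risk_for_class(scores[:, k], comp_labels, k, priors[k])
             for k in range(num_classes)]
total_risk = float(np.sum(per_class))
```

The per-class risks are summed into one multi-class objective; in practice the class priors would be estimated or supplied, which is why the paper's study of inaccurate priors matters.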
SCARCE outperforms state-of-the-art methods in most cases, demonstrating robustness and effectiveness across a variety of settings. The theoretical analysis provides insights into the convergence properties of the estimator and the calibration of the surrogate loss to the 0-1 loss.
Key insights extracted from: Wei Wang, Tak... at arxiv.org, 03-08-2024
https://arxiv.org/pdf/2311.15502.pdf