Basic Concepts
SCARCE is a consistent approach to complementary-label learning that does not rely on assumptions about the complementary-label distribution and shows superior performance over existing methods.
Summary
The paper introduces SCARCE, a novel approach to complementary-label learning that does not depend on assumptions about how complementary labels are generated. It also addresses the overfitting that arises when training complex models and demonstrates superior performance on benchmark datasets. The study compares SCARCE with existing methods and validates its effectiveness through extensive experiments on both synthetic and real-world datasets.
Introduction
- Complementary-label learning is a weakly supervised learning problem in which each training example is annotated with a class it does not belong to (see the sketch after this list).
- Training data with complementary labels is easier to collect than ordinarily labeled data.
- The benefits of complementary-label learning have been demonstrated in various machine learning applications.
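To make the setting concrete, here is a minimal sketch that attaches one complementary label to each example, i.e., a class the example does not belong to. The uniform sampling is purely an illustrative assumption; SCARCE's contribution is precisely that it does not rely on a particular generation distribution.

```python
import numpy as np

def make_complementary_labels(y, num_classes, rng=None):
    """Attach one complementary label per example: a class the example
    does NOT belong to. Uniform sampling here is only for illustration;
    SCARCE itself does not assume this generation distribution."""
    rng = np.random.default_rng() if rng is None else rng
    y = np.asarray(y)
    # A random offset in 1..num_classes-1, added modulo num_classes,
    # always lands on a class different from the true label.
    offsets = rng.integers(1, num_classes, size=y.shape[0])
    return (y + offsets) % num_classes

# Example: five examples, four classes.
true_labels = np.array([0, 1, 2, 3, 0])
comp_labels = make_complementary_labels(true_labels, num_classes=4)
assert not np.any(comp_labels == true_labels)  # never the true class
```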
Data Extraction
- "Extensive experimental results on both synthetic and real-world benchmark datasets validate the superiority of our proposed approach over state-of-the-art methods."
- "SCARCE achieves the best performance in 39 out of 40 cases with different distribution assumptions and datasets."
Quotations
- "We propose a novel consistent approach that does not rely on these conditions."
- "Inspired by the positive-unlabeled (PU) learning literature, we propose an unbiased risk estimator based on the Selected Completely At Random assumption for complementary-label learning."