Learning to Bootstrap (L2B) is a simple and effective method that lets models bootstrap themselves on their own predictions without being misled by erroneous pseudo-labels: it uses meta-learning to dynamically adjust the importance weights between the real observed labels and the generated labels, as well as between different samples.
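The core of this bootstrapping objective can be illustrated as a convex combination of two per-sample losses: one on the observed (possibly noisy) label and one on the model's own pseudo-label. A minimal sketch follows; note that in L2B the combination weight is produced by meta-learning on a clean validation set, whereas here it is a fixed scalar purely for illustration, and `cross_entropy` is a hypothetical helper, not part of the authors' released code.

```python
import numpy as np

def cross_entropy(probs, label):
    """Negative log-likelihood of the given class label."""
    return -np.log(probs[label] + 1e-12)

def bootstrap_loss(probs, observed_label, weight):
    """Convex combination of the loss on the observed (possibly noisy)
    label and the loss on the model's own pseudo-label.
    In L2B the weight is meta-learned per sample; a fixed scalar is
    used here only to illustrate the shape of the objective."""
    pseudo_label = int(np.argmax(probs))  # model's own prediction
    return (weight * cross_entropy(probs, observed_label)
            + (1.0 - weight) * cross_entropy(probs, pseudo_label))

# One sample whose softmax output disagrees with its observed label:
probs = np.array([0.7, 0.2, 0.1])
loss = bootstrap_loss(probs, observed_label=1, weight=0.8)
```

Lowering the weight shifts trust from the noisy annotation toward the model's prediction, which is exactly the knob the meta-learner tunes per sample.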
A novel Two-Stream Sample Distillation (TSSD) framework is designed to train a robust network under the supervision of noisy labels by jointly considering the sample structure in feature space and the human prior in loss space.
A novel Potential Energy-based Mixture Model (PEMM) handles noisy labels effectively by preserving the intrinsic data structure and driving the class centers toward a co-stable state.