The paper proposes a Potential Energy-based Mixture Model (PEMM) for robust learning from noisy labels. The key ideas are:
Inherent data information: The authors argue that the inherent structure of the data can be modeled by fitting a mixture model; representations that preserve this intrinsic structure make training less dependent on the (possibly noisy) class labels and therefore more robust.
Distance-based classifier: The authors use a distance-based classifier trained with a Reverse Cross Entropy (RCE) loss, which measures the cross entropy of the true (one-hot) label distribution under the predicted distribution, penalizing predictions from the opposite direction of standard cross entropy.
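The two ingredients above can be sketched as follows. This is a minimal illustration, not the paper's exact implementation: the squared-Euclidean distance logits and the `log_zero` clipping constant (a common convention from the symmetric cross entropy literature, where log 0 on the zero entries of a one-hot target is replaced by a fixed negative constant) are assumptions.

```python
import numpy as np

def distance_based_probs(features, centers):
    """Distance-based classifier: class probabilities from a softmax over
    negative squared Euclidean distances to the class centers.
    features: (N, D) array, centers: (K, D) array -> (N, K) probabilities."""
    # Pairwise squared distances between each feature and each center.
    diffs = features[:, None, :] - centers[None, :, :]      # (N, K, D)
    sq_dists = np.sum(diffs ** 2, axis=-1)                  # (N, K)
    logits = -sq_dists                                      # closer => larger logit
    logits -= logits.max(axis=1, keepdims=True)             # numerical stability
    exp = np.exp(logits)
    return exp / exp.sum(axis=1, keepdims=True)

def reverse_cross_entropy(pred_probs, labels, num_classes, log_zero=-4.0):
    """Reverse Cross Entropy: cross entropy of the one-hot label distribution
    under the predicted distribution.  log(0) for the zero entries of the
    one-hot target is clipped to `log_zero` (an assumed constant)."""
    one_hot = np.eye(num_classes)[labels]                   # true distribution q
    log_q = np.where(one_hot > 0, 0.0, log_zero)            # log q, clipped
    return -np.sum(pred_probs * log_q, axis=1)              # per-sample RCE
```

For example, a prediction `[0.7, 0.2, 0.1]` with true label 0 incurs an RCE of `-(0.2 + 0.1) * log_zero = 1.2`: the loss grows with the probability mass placed on wrong classes.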
Potential energy-based centers regularization: Inspired by the concept of potential energy in physics, the authors introduce a potential energy-based regularization on the class centers to encourage a co-stable state among them, which can help preserve the intrinsic data structure.
The authors conduct extensive experiments on benchmark datasets across various types and rates of label noise. The results show that PEMM achieves state-of-the-art performance in handling noisy labels, outperforming recent competing methods. Detailed analyses and ablation studies further demonstrate the contribution of each individual component of PEMM.
Source: Zijia Wang, W... at arxiv.org, 05-03-2024. https://arxiv.org/pdf/2405.01186.pdf