Key Concepts
An energy-based model is jointly trained with a classifier to improve confidence calibration and thereby the effectiveness of pseudo-label learning.
Summary
The paper proposes an energy-based pseudo-label learning (EBPL) algorithm that leverages an energy-based model (EBM) to improve confidence calibration and the effectiveness of pseudo-label learning.
Key highlights:
In pseudo-label learning, accurate confidence scores are crucial for selecting appropriate samples to assign pseudo-labels. However, deep neural networks often suffer from over-confidence issues, leading to poor confidence calibration.
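The confidence-based selection step described here can be sketched as follows. This is a generic illustration, not the paper's exact procedure; the function name and the 0.95 threshold are illustrative assumptions.

```python
import numpy as np

def select_pseudo_labels(probs, threshold=0.95):
    """Select unlabeled samples whose top softmax confidence exceeds
    the threshold; return their indices and hard pseudo-labels.

    probs: (n_samples, n_classes) array of predicted class probabilities.
    """
    conf = probs.max(axis=1)                  # top-1 confidence per sample
    keep = conf >= threshold                  # only confident predictions
    return np.nonzero(keep)[0], probs.argmax(axis=1)[keep]
```

If the model is over-confident, this filter admits misclassified samples whose confidence is spuriously high, which is exactly the failure mode that calibration is meant to prevent.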
EBPL addresses this by jointly training an NN-based classifier and an EBM that share a feature extractor. This allows the model to learn both the class decision boundary and the input data distribution, which improves confidence calibration.
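One common way to realize such a joint classifier/EBM setup is to define the energy directly from the classifier logits, as in JEM-style models; note this is an assumption for illustration, and EBPL's exact parameterization may differ.

```python
import numpy as np

def energy_from_logits(logits):
    """JEM-style energy E(x) = -logsumexp_y f(x)[y].

    Lower energy corresponds to higher unnormalized input density,
    so in-distribution, confidently classified inputs get low energy.
    Computed with the max-shift trick for numerical stability.
    """
    m = logits.max(axis=1, keepdims=True)
    return -(m.squeeze(1) + np.log(np.exp(logits - m).sum(axis=1)))
```

Because the same logits define both the class posterior (via softmax) and the energy, training the EBM term shapes the feature extractor with density information, which is the mechanism by which calibration can improve.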
Experimental results on image classification tasks show that EBPL outperforms existing pseudo-label learning methods in accuracy, F-score, and expected calibration error. EBPL is particularly effective when labeled data are extremely scarce.
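Expected calibration error (ECE), one of the reported metrics, measures the gap between a model's confidence and its actual accuracy. A minimal sketch of the standard equal-width binning estimator (bin count of 10 is a common default, not taken from the paper):

```python
import numpy as np

def expected_calibration_error(conf, correct, n_bins=10):
    """ECE: bin predictions by confidence, then take the weighted
    average of |mean confidence - accuracy| over the bins.

    conf:    (n,) array of top-1 confidences in [0, 1].
    correct: (n,) array of 0/1 indicators of correct prediction.
    """
    ece = 0.0
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            gap = abs(conf[mask].mean() - correct[mask].mean())
            ece += mask.mean() * gap        # weight by bin population
    return ece
```

A perfectly calibrated model has ECE 0: whenever it reports 75% confidence, it is right 75% of the time.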
Qualitative analysis shows that EBPL assigns more appropriate pseudo-labels by producing lower confidence scores for misclassified samples compared to the baseline method.
Statistics
The accuracy of pseudo-labeling at each step is higher for EBPL compared to the baseline method across the CIFAR-10 and Blood-MNIST datasets.
Quotes
"By referring to calibrated confidence, we can assign more accurate pseudo-labels, leading to more successful PL."
"EBPL demonstrated higher PL accuracy throughout the entire training process."