This paper introduces Balanced and Entropy-based Mix (BEM), a novel data mixing approach that re-balances the class distributions of both data quantity and data uncertainty to improve long-tailed semi-supervised learning.
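To make the mixing step concrete, below is a minimal PyTorch sketch of a MixUp-style operation in the spirit of BEM. The partner-sampling weights (inverse class frequency scaled by prediction entropy), the function name, and all arguments are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def balanced_entropy_mixup(x, logits, labels, class_counts, alpha=1.0):
    """Illustrative sketch: pair each sample with a partner drawn with
    probability favoring rare classes (inverse frequency) and
    high-uncertainty samples (prediction entropy)."""
    probs = F.softmax(logits, dim=1)
    entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=1)   # per-sample uncertainty
    inv_freq = 1.0 / class_counts[labels].float()                  # rarity of each sample's class
    weights = inv_freq * (1.0 + entropy)                           # combine quantity and uncertainty
    idx = torch.multinomial(weights, x.size(0), replacement=True)  # biased partner indices
    lam = torch.distributions.Beta(alpha, alpha).sample().item()   # standard MixUp coefficient
    x_mix = lam * x + (1.0 - lam) * x[idx]
    return x_mix, labels, labels[idx], lam
```

The mixed batch would then be trained with the usual MixUp objective, e.g. `lam * CE(model(x_mix), y_a) + (1 - lam) * CE(model(x_mix), y_b)`.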
To address long-tailed semi-supervised learning, where labeled data exhibit an imbalanced class distribution and unlabeled data follow an unknown distribution, the authors propose ComPlementary Experts (CPE), which trains multiple experts to model different class distributions, each yielding high-quality pseudo-labels under one form of class distribution. They also introduce Classwise Batch Normalization to avoid the performance degradation caused by the feature-distribution mismatch between head and non-head classes.
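As a sketch of the multi-expert idea, the snippet below builds several classifier heads on shared features and gives each a different logit adjustment, so each head is calibrated for a different assumed class distribution (roughly: long-tailed, uniform, inverted). The head count, tau values, and all names are assumptions, logit adjustment is a stand-in for the paper's actual mechanism, and Classwise Batch Normalization is omitted here.

```python
import torch
import torch.nn as nn

class MultiExpertHead(nn.Module):
    """Illustrative sketch: several linear experts over shared features;
    each expert's logits are shifted by a different multiple of the
    log class prior, targeting a different class distribution."""
    def __init__(self, feat_dim, num_classes, class_counts, taus=(0.0, 1.0, 2.0)):
        super().__init__()
        self.experts = nn.ModuleList(nn.Linear(feat_dim, num_classes) for _ in taus)
        prior = class_counts.float() / class_counts.sum()
        self.register_buffer("log_prior", prior.log())
        self.taus = taus

    def forward(self, feats):
        # tau = 0 leaves the logits unadjusted; larger tau pushes the
        # expert's calibration toward flatter or inverted distributions.
        return [expert(feats) - tau * self.log_prior
                for expert, tau in zip(self.experts, self.taus)]
```

At pseudo-labeling time, one would then query the expert whose assumed distribution best matches the unlabeled data; how CPE combines the experts is beyond this sketch.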
This paper introduces Continuous Contrastive Learning (CCL), a novel method for long-tailed semi-supervised learning (LTSSL) that uses a probabilistic framework to unify existing long-tail learning approaches and leverages continuous pseudo-labels to improve representation learning on imbalanced datasets.
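A minimal sketch of what a contrastive loss with continuous pseudo-labels could look like: instead of treating pairs as strictly positive or negative via hard pseudo-labels, each pair is weighted by the similarity of the two samples' soft predictions. The weighting scheme and names are illustrative assumptions, not CCL's exact objective.

```python
import torch
import torch.nn.functional as F

def soft_contrastive_loss(feats, pseudo_probs, temperature=0.1):
    """Illustrative sketch: contrastive loss where the positive weight
    of pair (i, j) is the dot product of their soft class predictions."""
    z = F.normalize(feats, dim=1)
    sim = z @ z.t() / temperature                                   # pairwise feature similarity
    off_diag = ~torch.eye(len(z), dtype=torch.bool, device=z.device)
    log_p = F.log_softmax(sim.masked_fill(~off_diag, -1e9), dim=1)  # exclude self-pairs
    weights = (pseudo_probs @ pseudo_probs.t()) * off_diag          # continuous "same-class" weights
    weights = weights / weights.sum(dim=1, keepdim=True).clamp_min(1e-12)
    return -(weights * log_p).sum(dim=1).mean()
```

With hard one-hot `pseudo_probs` this reduces to a standard supervised contrastive loss, which is what makes the continuous relaxation easy to see.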