The author proposes a shifted normal distribution sampling function to enhance cost efficiency in active learning, particularly in cases of imbalanced labeling costs for positive and negative instances.
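The paper's exact sampling function is not reproduced here; as a minimal sketch, assuming the "shift" recenters a Gaussian acquisition weight away from the usual uncertainty peak at p = 0.5 toward the class whose labels are cheaper (`mu` and `sigma` below are illustrative parameters, not the author's values):

```python
import numpy as np

def shifted_normal_weights(probs, mu=0.3, sigma=0.1):
    # Gaussian acquisition weight centered at mu instead of 0.5:
    # points whose predicted positive probability is near mu get
    # the highest weight, biasing selection toward the cheaper class.
    return np.exp(-((probs - mu) ** 2) / (2 * sigma ** 2))

def select(probs, k, mu=0.3, sigma=0.1):
    # Pick the k unlabeled points with the largest shifted weight.
    w = shifted_normal_weights(probs, mu, sigma)
    return np.argsort(-w)[:k]
```

With standard uncertainty sampling this weight would peak at p = 0.5; shifting `mu` trades some informativeness for lower expected labeling cost when positive and negative annotations are priced differently.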
The author introduces the TCM heuristic to combine diversity-based and uncertainty-based sampling strategies in active learning, leveraging self-supervised pre-trained models for improved performance across different labeling budgets.
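TCM's actual heuristic is not shown here; the following is only a generic sketch of how uncertainty- and diversity-based selection can be combined, using binary entropy for uncertainty and a greedy farthest-point step over embeddings for diversity (both choices are illustrative, not TCM itself):

```python
import numpy as np

def entropy(p):
    # Binary entropy of the predicted positive-class probability.
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log(p) + (1 - p) * np.log(1 - p))

def hybrid_select(X, probs, k):
    # X: (n, d) embeddings of unlabeled points (e.g. from a
    # self-supervised pre-trained encoder); probs: model predictions.
    u = entropy(probs)
    selected = [int(np.argmax(u))]  # seed with the most uncertain point
    for _ in range(k - 1):
        # Diversity term: distance to the nearest already-selected point.
        d = np.min(
            np.linalg.norm(X[:, None, :] - X[selected][None, :, :], axis=2),
            axis=1,
        )
        score = u * d  # uncertain AND far from what we already picked
        score[selected] = -np.inf
        selected.append(int(np.argmax(score)))
    return selected
```

The multiplicative combination means a point must score well on both criteria to be chosen; a redundant point near an already-selected one gets a near-zero diversity term regardless of its uncertainty.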
The author introduces the novel active learning method SUPClust, focusing on identifying points at decision boundaries to enhance model performance through informative data labeling.
Pre-trained models combined with active learning strategies can reduce annotation costs, but the performance drop in proxy-based active learning needs to be addressed.
Encouraging students to actively participate in the learning process through self-generated tests improves academic performance.
The author introduces an effective active learning strategy that combines diversity and uncertainty.