Core Concepts
Enhancing prediction performance through a novel hybrid sampling approach in the iBRF classifier.
Abstract
The article discusses the challenge of class imbalance in classification tasks and proposes an improved Balanced Random Forest (iBRF) classifier. The iBRF algorithm combines neighborhood cleaning, random undersampling, and SMOTE to balance the class distribution. By integrating this hybrid sampling technique into the Random Forest architecture, the classifier achieves better generalization and prediction performance. Experimental results on 44 imbalanced datasets show significant improvements over traditional sampling techniques and other ensemble approaches.
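The three-stage resampling described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the helper names (`neighborhood_clean`, `hybrid_resample`), the ENN-style cleaning rule, the 0.75 undersampling ratio, and the use of plain scikit-learn components are all assumptions; the paper's exact per-bootstrap sampling and parameter choices are not reproduced here.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import NearestNeighbors, KNeighborsClassifier

def neighborhood_clean(X, y, majority=0, k=3):
    # ENN-style neighborhood cleaning (an approximation of NCL):
    # drop majority samples that disagree with their k nearest neighbors.
    knn = KNeighborsClassifier(n_neighbors=k).fit(X, y)
    keep = (y != majority) | (knn.predict(X) == y)
    return X[keep], y[keep]

def random_undersample(X, y, majority=0, ratio=0.75, rng=None):
    # Randomly keep a fraction of the majority class (ratio is an assumption).
    rng = np.random.default_rng(rng)
    maj_idx = np.flatnonzero(y == majority)
    min_idx = np.flatnonzero(y != majority)
    n_keep = min(len(maj_idx), max(len(min_idx), int(len(maj_idx) * ratio)))
    keep = rng.choice(maj_idx, size=n_keep, replace=False)
    idx = np.concatenate([keep, min_idx])
    return X[idx], y[idx]

def smote(X, y, minority=1, k=5, rng=None):
    # Synthesize minority samples by interpolating between a minority point
    # and one of its k nearest minority neighbors until classes balance.
    rng = np.random.default_rng(rng)
    X_min = X[y == minority]
    n_new = int((y != minority).sum()) - len(X_min)
    if n_new <= 0:
        return X, y
    nn = NearestNeighbors(n_neighbors=min(k + 1, len(X_min))).fit(X_min)
    _, nbrs = nn.kneighbors(X_min)  # column 0 is the point itself
    base = rng.integers(0, len(X_min), size=n_new)
    picks = nbrs[base, rng.integers(1, nbrs.shape[1], size=n_new)]
    gap = rng.random((n_new, 1))
    X_new = X_min[base] + gap * (X_min[picks] - X_min[base])
    return np.vstack([X, X_new]), np.concatenate([y, np.full(n_new, minority)])

def hybrid_resample(X, y, majority=0, minority=1, rng=None):
    # Clean, undersample, then oversample -- the pipeline the abstract describes.
    X, y = neighborhood_clean(X, y, majority=majority)
    X, y = random_undersample(X, y, majority=majority, rng=rng)
    return smote(X, y, minority=minority, rng=rng)

if __name__ == "__main__":
    # Toy imbalanced problem (roughly 9:1), resampled and fed to a Random Forest.
    X, y = make_classification(n_samples=400, n_features=8,
                               weights=[0.9, 0.1], random_state=0)
    Xb, yb = hybrid_resample(X, y, rng=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xb, yb)
    print(np.bincount(y), "->", np.bincount(yb))
```

A production version would more likely use `imbalanced-learn` (`NeighbourhoodCleaningRule`, `RandomUnderSampler`, `SMOTE`), and iBRF applies the balancing within the forest's bootstrap sampling rather than once on the whole training set.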
Stats
Experiments on 44 imbalanced datasets showed an average MCC score of 53.04% and an F1 score of 55% for the proposed iBRF algorithm.
Quotes
"Our proposed hybrid sampling technique achieves better prediction performance than other sampling techniques used in imbalanced classification tasks."
"The iBRF algorithm outperformed other ensemble approaches by producing superior MCC scores."