This paper introduces Differentiable Information Imbalance (DII), a novel feature selection method that identifies and weights informative features by minimizing the discrepancy between the input and ground-truth distance spaces, improving data representations and the training of machine-learning potentials.
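A minimal sketch of the DII idea, assuming a PyTorch formulation: per-feature weights are learned so that the softmax-smoothed neighbour structure of the weighted input distances matches the distance ranks of the ground-truth space. The loss scaling, the fixed smoothing parameter `lam`, and the toy data are illustrative choices; the authors' reference implementation (reportedly in the DADApy package) handles these details differently.

```python
import torch

def pairwise_dist(X, w=None):
    """Euclidean pairwise distances, optionally with per-feature weights w."""
    Xw = X * w if w is not None else X
    sq = ((Xw[:, None, :] - Xw[None, :, :]) ** 2).sum(-1)
    return torch.sqrt(sq + 1e-12)  # epsilon keeps gradients finite on the diagonal

def dii_loss(X_in, X_gt, w, lam=1.0):
    """Soft measure of how well weighted input distances predict ground-truth ranks."""
    n = X_in.shape[0]
    d_in = pairwise_dist(X_in, w)
    d_gt = pairwise_dist(X_gt)
    # Rank of each pairwise ground-truth distance within its row (0 = self).
    ranks_gt = d_gt.argsort(dim=1).argsort(dim=1).float()
    # Softmax over negative input distances: a differentiable "nearest neighbour" weight.
    off_diag = ~torch.eye(n, dtype=torch.bool)
    logits = torch.where(off_diag, -d_in / lam, torch.tensor(float("-inf")))
    c = torch.softmax(logits, dim=1)
    return 2.0 / n * (c * ranks_gt).sum() / n

# Toy data: only the first three features matter for the ground-truth space.
X_in = torch.randn(200, 10)
X_gt = X_in[:, :3] * torch.tensor([3.0, 1.0, 0.5])

w = torch.ones(10, requires_grad=True)
opt = torch.optim.Adam([w], lr=0.05)
for _ in range(300):
    opt.zero_grad()
    loss = dii_loss(X_in, X_gt, w.abs())
    loss.backward()
    opt.step()
print(w.abs().detach())  # larger weights indicate more informative features
```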
Shap-select is a new feature selection framework that improves the performance of machine learning models by combining SHAP values with statistical significance testing during the model training process.
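A hedged sketch of the underlying recipe, assuming shap-select regresses the target on validation-set SHAP values and keeps features with statistically significant coefficients; the package's actual API, iteration scheme, and multiple-testing handling may differ. The model, dataset, and 0.05 cutoff below are illustrative.

```python
import numpy as np
import shap
import statsmodels.api as sm
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=10, n_informative=4, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
shap_vals = shap.TreeExplainer(model).shap_values(X_val)  # shape (n_val, n_features)

# Regress the target on the per-feature SHAP values and test coefficient significance.
ols = sm.OLS(y_val, sm.add_constant(shap_vals)).fit()
pvalues = ols.pvalues[1:]                 # skip the intercept
selected = np.where(pvalues < 0.05)[0]    # illustrative 0.05 cutoff
print("selected feature indices:", selected)
```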
This paper introduces Integrated Path Stability Selection (IPSS), a novel thresholding-based feature selection method whose gradient boosting (IPSSGB) and random forest (IPSSRF) variants achieve superior error control, true-positive detection, and computational efficiency compared to existing methods.
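A simplified, illustrative sketch of path-based stability selection with gradient boosting, in the spirit of IPSSGB: selection frequencies are recorded along a path of model complexities over random subsamples, then averaged (integrated) over the path. The real IPSS derives its threshold from a theoretical bound on the expected number of false positives; the path grid, selection criterion, and cutoff `tau` here are placeholders.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=400, n_features=20, n_informative=5, random_state=0)
n, p = X.shape
rng = np.random.default_rng(0)

path = [10, 25, 50, 100]          # path of model complexities (number of trees)
B = 30                            # number of random half-subsamples
freq = np.zeros((len(path), p))   # selection frequency per path point and feature

for _ in range(B):
    idx = rng.choice(n, size=n // 2, replace=False)
    for k, n_trees in enumerate(path):
        gbm = GradientBoostingClassifier(n_estimators=n_trees, max_depth=2,
                                         random_state=0).fit(X[idx], y[idx])
        freq[k] += gbm.feature_importances_ > 0   # "selected" = nonzero importance

freq /= B
integrated = freq.mean(axis=0)    # integrate selection probability over the path
tau = 0.75                        # placeholder cutoff; IPSS sets it via an E[FP] bound
print("selected features:", np.where(integrated >= tau)[0])
```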
Greedy feature selection adds, at each step, the feature judged most important by the chosen classifier, improving model performance.
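A minimal sketch of greedy forward selection with scikit-learn: at each step, the feature that most improves the chosen classifier's cross-validated score is added. The classifier, scoring metric, and stopping rule are illustrative choices, not the paper's.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=15, n_informative=4, random_state=0)
clf = LogisticRegression(max_iter=1000)

selected, remaining = [], list(range(X.shape[1]))
best_score = -np.inf
while remaining:
    # Score every candidate feature when added to the current subset.
    scores = {j: cross_val_score(clf, X[:, selected + [j]], y, cv=5).mean()
              for j in remaining}
    j_best, s_best = max(scores.items(), key=lambda kv: kv[1])
    if s_best <= best_score:      # stop when no candidate improves the score
        break
    selected.append(j_best)
    remaining.remove(j_best)
    best_score = s_best

print("selected features:", selected, "cv accuracy:", round(best_score, 3))
```

Note that this greedy loop refits the classifier O(p) times per step, so its cost grows quadratically with the number of candidate features.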
An efficient multi-objective genetic algorithm for multi-view feature selection delivers superior performance and interpretability when selecting features from multi-view datasets.
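A toy sketch of the general idea, not the paper's algorithm: individuals are binary masks over the concatenated views, and a bare-bones Pareto-based genetic algorithm trades off cross-validated accuracy against the number of selected features. The encoding, operators, and objective pair are assumptions made for illustration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
view1, y = make_classification(n_samples=300, n_features=10, n_informative=3, random_state=0)
view2 = rng.normal(size=(300, 10))       # a second, mostly uninformative view
X = np.hstack([view1, view2])
p = X.shape[1]

def objectives(mask):
    """Return (cv accuracy to maximise, number of features to minimise)."""
    if mask.sum() == 0:
        return (0.0, p)
    acc = cross_val_score(LogisticRegression(max_iter=1000),
                          X[:, mask.astype(bool)], y, cv=3).mean()
    return (acc, int(mask.sum()))

def dominates(a, b):
    """a dominates b: no worse in both objectives, strictly better in one."""
    return a[0] >= b[0] and a[1] <= b[1] and (a[0] > b[0] or a[1] < b[1])

pop = rng.integers(0, 2, size=(20, p))
for _ in range(15):
    objs = [objectives(m) for m in pop]
    # Keep the non-dominated individuals, then refill by crossover and mutation.
    front = [i for i, a in enumerate(objs)
             if not any(dominates(objs[j], a) for j in range(len(objs)) if j != i)]
    parents = pop[front]
    children = []
    while len(children) + len(parents) < len(pop):
        a, b = parents[rng.integers(len(parents))], parents[rng.integers(len(parents))]
        child = np.where(rng.random(p) < 0.5, a, b)                # uniform crossover
        child = np.where(rng.random(p) < 0.05, 1 - child, child)   # bit-flip mutation
        children.append(child)
    pop = np.vstack([parents, np.array(children)]) if children else parents

for m, o in zip(pop, [objectives(m) for m in pop]):
    print(f"accuracy={o[0]:.3f}  n_features={o[1]}")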
This work examines the impact of multivariate symmetrical uncertainty on feature selection.
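For reference, pairwise symmetrical uncertainty is SU(X, Y) = 2·I(X;Y) / (H(X) + H(Y)); the multivariate variant studied here generalizes the entropies and mutual information to feature subsets. A small sketch of the pairwise quantity for discrete variables:

```python
import numpy as np
from collections import Counter

def entropy(x):
    """Shannon entropy (bits) of a discrete sample."""
    counts = np.array(list(Counter(x).values()), dtype=float)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def mutual_information(x, y):
    return entropy(x) + entropy(y) - entropy(list(zip(x, y)))

def symmetrical_uncertainty(x, y):
    hx, hy = entropy(x), entropy(y)
    return 2 * mutual_information(x, y) / (hx + hy) if hx + hy > 0 else 0.0

x = [0, 0, 1, 1, 2, 2, 0, 1]
y = [0, 0, 1, 1, 1, 1, 0, 1]          # y depends on x
print(symmetrical_uncertainty(x, y))  # high for dependent variables, ~0 for independent
```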
GRROOR is a new method for multi-label feature selection based on global redundancy and relevance optimization in orthogonal regression.
BoUTS is a novel feature selection algorithm that identifies both universal and task-specific features, enhancing interpretability and performance across diverse datasets.