Core Concepts
Integrating kernel functions and kernel alignment improves unsupervised feature selection by capturing nonlinear relationships among features.
Abstract
Unsupervised feature selection is essential when processing high-dimensional data. This work introduces kernel alignment into subspace learning to capture nonlinear structural information among features. The proposed methods, KAUFS and MKAUFS, outperform classic and state-of-the-art unsupervised feature selection techniques in clustering quality and redundancy reduction across various datasets.
Introduction
The complexity of high-dimensional data amplifies computational demands.
Unsupervised feature selection is crucial for removing irrelevant and redundant features.
Subspace Learning
Subspace learning is effective for projecting high-dimensional data onto a representative low-dimensional subspace.
Various regularization frameworks aid in noise removal and dimensionality reduction.
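As a concrete illustration, here is a minimal sketch of one common matrix-factorization formulation of subspace learning for feature selection (X ≈ XWH, with the row norms of W scoring the original features). The model and the NMF-style multiplicative updates are standard assumptions for this family of methods, not necessarily the paper's exact objective.

```python
import numpy as np

# Sketch: subspace learning for feature selection via X ~ X @ W @ H.
# W (d x k) acts as a feature-selection matrix; H (k x d) reconstructs.
rng = np.random.default_rng(0)
X = np.abs(rng.standard_normal((100, 20)))   # n=100 samples, d=20 features
d, k = X.shape[1], 5                          # select a 5-dimensional subspace

W = np.abs(rng.standard_normal((d, k)))
H = np.abs(rng.standard_normal((k, d)))
XtX = X.T @ X

for _ in range(200):                          # NMF-style multiplicative updates
    W *= (XtX @ H.T) / (XtX @ W @ H @ H.T + 1e-12)
    H *= (W.T @ XtX) / (W.T @ XtX @ W @ H + 1e-12)

scores = np.linalg.norm(W, axis=1)            # importance score per feature
selected = np.argsort(scores)[::-1][:k]       # indices of the top-k features
print("selected features:", selected)
```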
Kernel Alignment
Kernel alignment evaluates the similarity between the kernel matrix built on all original features and the one built on the selected features.
Non-negative matrix factorization is used to derive an efficient optimization algorithm.
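The standard (centered) kernel alignment score is A(K1, K2) = ⟨K1, K2⟩_F / (‖K1‖_F ‖K2‖_F). The sketch below computes it between the kernel on all features and the kernel on a candidate subset; the Gaussian kernel and the particular subset are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def gaussian_kernel(X, gamma=0.1):
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T   # pairwise squared distances
    return np.exp(-gamma * d2)

def center(K):
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n            # centering matrix
    return H @ K @ H

def alignment(K1, K2):
    # A(K1, K2) = <K1, K2>_F / (||K1||_F * ||K2||_F), in [0, 1] for PSD kernels
    return np.sum(K1 * K2) / (np.linalg.norm(K1) * np.linalg.norm(K2))

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
subset = [0, 3, 7, 12, 19]                          # hypothetical selected features

K_all = center(gaussian_kernel(X))
K_sub = center(gaussian_kernel(X[:, subset]))
print("alignment:", alignment(K_all, K_sub))
```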
Multiple Kernel Method
Multiple kernel learning addresses the sensitivity of single-kernel models to the choice of kernel function and its parameters.
A consensus kernel matrix, combining multiple base kernels, enhances robustness and performance.
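A common way to form such a consensus kernel is a convex combination of base kernels, K = Σ_m w_m K_m with w_m ≥ 0 and Σ_m w_m = 1. The base kernels and fixed uniform weights below are illustrative; in the paper the weights are learned jointly with the feature selection.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))

def gaussian_kernel(X, gamma):
    sq = np.sum(X**2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

# Candidate kernels: Gaussian kernels at several bandwidths plus a linear kernel.
kernels = [gaussian_kernel(X, g) for g in (0.01, 0.1, 1.0)] + [X @ X.T]

weights = np.full(len(kernels), 1.0 / len(kernels))  # uniform init; w >= 0, sum = 1
K_consensus = sum(w * K for w, K in zip(weights, kernels))
print(K_consensus.shape)  # (100, 100)
```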
Algorithm and Convergence Analysis
Iterative update rules guarantee the convergence of the KAUFS and MKAUFS methods.
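In practice, convergence of such multiplicative update schemes is monitored by stopping when the relative decrease of the objective falls below a tolerance. The sketch below reuses the earlier X ≈ XWH model as a stand-in objective; it is not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.abs(rng.standard_normal((100, 20)))
W = np.abs(rng.standard_normal((20, 5)))
H = np.abs(rng.standard_normal((5, 20)))
XtX = X.T @ X

prev = np.inf
for it in range(500):
    W *= (XtX @ H.T) / (XtX @ W @ H @ H.T + 1e-12)
    H *= (W.T @ XtX) / (W.T @ XtX @ W @ H + 1e-12)
    obj = np.linalg.norm(X - X @ W @ H) ** 2      # current objective value
    if prev - obj < 1e-6 * max(prev, 1.0):        # relative-change stopping rule
        print(f"converged at iteration {it}, objective {obj:.4f}")
        break
    prev = obj
```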
Computational Complexity
Computational complexity analysis shows the efficiency of the KAUFS and MKAUFS algorithms.
Numerical Experiments
Evaluation metrics such as clustering accuracy (ACC), normalized mutual information (NMI), and redundancy rate (RED) are used to compare the proposed methods with existing UFS techniques on diverse datasets.
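Minimal sketches of the three metrics follow. ACC matches predicted cluster labels to ground truth via the Hungarian algorithm; RED is taken here as the mean absolute pairwise correlation among selected features, a common definition whose exact form may differ from the paper's.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from sklearn.metrics import normalized_mutual_info_score

def clustering_accuracy(y_true, y_pred):
    classes = np.unique(np.concatenate([y_true, y_pred]))
    cost = np.zeros((classes.size, classes.size))
    for i, c1 in enumerate(classes):
        for j, c2 in enumerate(classes):
            cost[i, j] = -np.sum((y_pred == c1) & (y_true == c2))
    row, col = linear_sum_assignment(cost)        # best label permutation
    return -cost[row, col].sum() / y_true.size

def redundancy(X_selected):
    C = np.abs(np.corrcoef(X_selected, rowvar=False))
    m = C.shape[0]
    return (C.sum() - m) / (m * (m - 1))          # mean off-diagonal |correlation|

y_true = np.array([0, 0, 1, 1, 2, 2])
y_pred = np.array([1, 1, 2, 2, 0, 0])             # same partition, permuted labels
print("ACC:", clustering_accuracy(y_true, y_pred))           # 1.0
print("NMI:", normalized_mutual_info_score(y_true, y_pred))  # 1.0

rng = np.random.default_rng(0)
print("RED:", redundancy(rng.standard_normal((50, 5))))
```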
Stats
Most existing matrix factorization-based unsupervised feature selection methods are built upon subspace learning.
Experimental analysis demonstrates that the proposed methods outperform classic UFS methods in clustering performance.