Core Concepts
Enhancing sparsity and robustness in unsupervised feature learning through Graph Regularized NMF with an ℓ2,0-norm constraint.
Abstract
Nonnegative Matrix Factorization (NMF) is widely used in machine learning and data mining. Graph Regularized Nonnegative Matrix Factorization (GNMF) improves clustering and dimensionality reduction but is sensitive to noise. Introducing an ℓ2,0-norm constraint enhances feature sparsity, mitigates the impact of noise, and supports effective feature selection. An unsupervised feature learning framework based on GNMF_ℓ2,0 is proposed, optimized with the PALM (Proximal Alternating Linearized Minimization) algorithm. Convergence of the algorithms is established, and their effectiveness is validated through experiments on simulated and real image data.
Stats
X ∈ ℝ^{p×n} (data matrix), W ∈ ℝ^{p×r} (basis matrix), H ∈ ℝ^{r×n} (coefficient matrix), A ∈ ℝ^{n×n} (graph adjacency matrix), graph regularization term λ Tr(H L Hᵀ) with graph Laplacian L.
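The matrices above can be combined into the usual graph-regularized NMF objective, ||X − WH||_F² + λ Tr(H L Hᵀ). The sketch below is a minimal NumPy illustration of evaluating that objective; the function name `gnmf_objective` and the random toy data are assumptions for demonstration, not the paper's code.

```python
import numpy as np

def gnmf_objective(X, W, H, L, lam):
    """GNMF objective: ||X - W H||_F^2 + lam * Tr(H L H^T)."""
    recon = np.linalg.norm(X - W @ H, "fro") ** 2
    smooth = lam * np.trace(H @ L @ H.T)
    return recon + smooth

# Toy data with the dimensions above: X in R^{p x n}, W in R^{p x r}, H in R^{r x n}.
rng = np.random.default_rng(0)
p, n, r = 6, 5, 2
X = rng.random((p, n))                  # data matrix
W = rng.random((p, r))                  # basis matrix
H = rng.random((r, n))                  # coefficient matrix
A = np.triu((rng.random((n, n)) > 0.5).astype(float), 1)
A = A + A.T                             # symmetric adjacency A in R^{n x n}
L = np.diag(A.sum(axis=1)) - A          # graph Laplacian L = D - A
val = gnmf_objective(X, W, H, L, 0.1)
```

Because L is a graph Laplacian, the smoothness term Tr(H L Hᵀ) is nonnegative, so the whole objective stays nonnegative.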
The ℓ2,0-norm constraint suppresses less significant features, encouraging a sparse representation.
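Concretely, an ℓ2,0 constraint bounds the number of nonzero rows of a matrix, and its projection keeps only the rows with the largest Euclidean norms. The helper below, `project_l20_rows`, is an illustrative sketch of that projection (assuming the constraint is applied row-wise, e.g. to the basis matrix W for feature selection); it is not taken from the paper.

```python
import numpy as np

def project_l20_rows(W, k):
    """Keep the k rows of W with the largest Euclidean norm; zero the rest.
    This enforces ||W||_{2,0} <= k, i.e. at most k nonzero rows."""
    norms = np.linalg.norm(W, axis=1)
    out = np.zeros_like(W)
    keep = np.argsort(norms)[-k:]       # indices of the k largest-norm rows
    out[keep] = W[keep]
    return out

W = np.array([[3.0, 4.0],   # norm 5.0   -> kept
              [0.1, 0.1],   # norm ~0.14 -> zeroed
              [1.0, 0.0]])  # norm 1.0   -> kept
W_sparse = project_l20_rows(W, 2)
```

Rows whose norms fall below the top k are zeroed wholesale, which is exactly the "reduction of less significant features" the constraint encourages.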
iPALM and BPL are accelerated algorithms for non-convex optimization, used here to speed up convergence of the base method.
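A plain (non-accelerated) PALM iteration alternates, for each block, a gradient step with step size 1/Lipschitz followed by a proximal projection. The loop below is an illustrative sketch under those assumptions (nonnegativity projection on both factors, a row-wise ℓ2,0 projection on W, symmetric Laplacian L); the function `palm_gnmf_l20` and its parameters are hypothetical, not the paper's implementation.

```python
import numpy as np

def palm_gnmf_l20(X, L, r, k, lam=0.1, iters=100, seed=0):
    """PALM-style alternating updates for GNMF with a row-wise l2,0
    constraint on W: linearized gradient step per block, then a
    proximal projection (nonnegativity; keep top-k rows of W)."""
    rng = np.random.default_rng(seed)
    p, n = X.shape
    W = rng.random((p, r))
    H = rng.random((r, n))
    for _ in range(iters):
        # W-block: gradient of the data-fit term, step 1/Lipschitz, project.
        grad_W = (W @ H - X) @ H.T
        lip_W = max(np.linalg.norm(H @ H.T, 2), 1e-12)
        W = np.maximum(W - grad_W / lip_W, 0.0)
        W[np.argsort(np.linalg.norm(W, axis=1))[:-k]] = 0.0  # l2,0: keep k rows
        # H-block: gradient includes the graph term (assumes L symmetric).
        grad_H = W.T @ (W @ H - X) + lam * H @ L
        lip_H = max(np.linalg.norm(W.T @ W, 2) + lam * np.linalg.norm(L, 2), 1e-12)
        H = np.maximum(H - grad_H / lip_H, 0.0)
    return W, H

# Toy run: p=8 features, rank r=2, keep at most k=3 features.
rng = np.random.default_rng(1)
p, n, r, k = 8, 10, 2, 3
X = rng.random((p, n))
A = np.triu((rng.random((n, n)) > 0.5).astype(float), 1)
A = A + A.T
L = np.diag(A.sum(axis=1)) - A
W, H = palm_gnmf_l20(X, L, r, k)
```

iPALM adds an inertial (momentum) term to each block update, and block prox-linear schemes generalize the same gradient-then-prox pattern; both accelerate this basic loop without changing its structure.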