This survey presents a comprehensive overview of Nonnegative Matrix Factorization (NMF) and its applications in dimensionality reduction. It begins by providing a classification of dimensionality reduction methods, highlighting the key differences between feature extraction and feature selection approaches.
The paper then covers the background of NMF, explaining its mathematical formulation and the use of the β-divergence family to define the loss function. This lays the foundation for the main taxonomy of NMF in dimensionality reduction.
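To make the formulation concrete, the sketch below factors a nonnegative matrix X into W @ H using Lee and Seung's classical multiplicative updates for the Frobenius loss, which is the β-divergence with β = 2 (β = 1 gives Kullback–Leibler, β = 0 gives Itakura–Saito). This is a minimal illustration, not the survey's specific algorithm; all variable names are our own.

```python
import numpy as np

def nmf_multiplicative(X, r, n_iter=200, eps=1e-10, seed=0):
    """Minimal NMF sketch: X ~= W @ H with W, H >= 0.

    Uses multiplicative updates for the Frobenius loss
    (beta-divergence with beta = 2). Illustrative only.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        # Elementwise multiplicative updates keep both factors
        # nonnegative by construction.
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Usage: factor a small nonnegative matrix and measure the
# Frobenius reconstruction error.
X = np.random.default_rng(1).random((6, 5))
W, H = nmf_multiplicative(X, r=2)
err = np.linalg.norm(X - W @ H)
```

The small `eps` in the denominators is a common numerical guard against division by zero; it is not part of the mathematical formulation itself.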
For feature extraction, the survey categorizes NMF approaches into four main groups: Variants of NMF, Regularized NMF, Generalized NMF, and Robust NMF. It discusses the distinctive characteristics and advancements within each category, including variants such as Symmetric NMF, Orthogonal NMF, Nonnegative Matrix Tri-Factorization, and Projective NMF.
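As one example from the variants above, Symmetric NMF factors a symmetric nonnegative matrix S (typically a similarity matrix) as S ≈ H @ H.T with a single nonnegative factor. The sketch below uses a damped multiplicative update for the Frobenius objective; it is an assumed, illustrative implementation, not an algorithm quoted from the survey.

```python
import numpy as np

def symmetric_nmf(S, r, n_iter=300, damp=0.5, eps=1e-10, seed=0):
    """Symmetric NMF sketch: S ~= H @ H.T with H >= 0, where S is
    symmetric and nonnegative (e.g. a similarity matrix).

    Uses a damped multiplicative update; damping (damp in (0, 1])
    is a common stabilization for the symmetric case.
    """
    rng = np.random.default_rng(seed)
    n = S.shape[0]
    H = rng.random((n, r)) + eps
    for _ in range(n_iter):
        num = S @ H
        den = H @ (H.T @ H) + eps
        # Blend the old iterate with the multiplicative step.
        H *= (1 - damp) + damp * (num / den)
    return H

# Usage: approximate a low-rank symmetric nonnegative matrix.
A = np.random.default_rng(1).random((6, 2))
S = A @ A.T  # symmetric and nonnegative by construction
H = symmetric_nmf(S, r=2)
```

Because S here is exactly rank 2 and nonnegative, a rank-2 symmetric factorization can fit it closely; on real similarity matrices the fit is only approximate.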
Regarding feature selection, the paper analyzes NMF from six perspectives: Standard NMF Problem, Convex-NMF Problem, Graph-Based, Dual Graph-Based, Sparsity, and Orthogonality Constraint. This exploration clarifies the various ways NMF can be applied effectively to feature selection in dimensionality reduction tasks.
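One simple way NMF can drive feature selection, sketched below under our own assumptions (this is an illustration, not a specific method from the survey), is to factor the data matrix X ≈ W @ H and score each original feature j by the magnitude of column j of H, keeping the top-scoring features.

```python
import numpy as np

def nmf_feature_scores(X, r, n_iter=200, eps=1e-10, seed=0):
    """Illustrative NMF-based feature scoring: factor X ~= W @ H
    (X is samples x features) and rank feature j by the l2 norm
    of column j of H. A sketch, not a method from the survey.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        # Standard multiplicative updates for the Frobenius loss.
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return np.linalg.norm(H, axis=0)

# Usage: score features and select the top k.
rng = np.random.default_rng(2)
X = rng.random((20, 8))
X[:, 0] *= 10.0  # make feature 0 artificially dominant
scores = nmf_feature_scores(X, r=3)
top_k = np.argsort(scores)[::-1][:3]
```

The dominant feature carries most of the reconstruction energy, so its H column acquires the largest norm; real methods in the survey add sparsity, graph, or orthogonality constraints rather than relying on raw column norms.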
The survey also highlights the advantages and limitations of NMF compared to other dimensionality reduction techniques, as well as potential future research directions in this field.
Source: Farid Saberi..., arxiv.org, 05-07-2024
https://arxiv.org/pdf/2405.03615.pdf