Key Concepts
The scaled generalized minimax concave (sGMC) model, a nonconvex extension of the LASSO model, preserves many favorable properties of LASSO, including unique and sparse solutions, as well as a piecewise linear regularization path.
Abstract
The paper studies the solution-set geometry and regularization path of the scaled generalized minimax concave (sGMC) model, a nonconvex sparse regression model that can preserve the overall-convexity of the optimization problem.
Key highlights:
For a fixed regularization parameter λ, the sGMC solution set is nonempty, closed, bounded and convex. If the columns of the sensing matrix A are in general position (a condition that holds with probability one when the entries of A are drawn from a continuous distribution), the sGMC solution is unique and sparse, with at most min{m, n} nonzero components.
For a varying λ, the extended sGMC solution set (the Cartesian product of the primal and dual sGMC solution sets) is a continuous, piecewise polytope-valued mapping of λ. The minimum ℓ2-norm extended regularization path of the sGMC model remains piecewise linear, similar to the LASSO model.
Exploiting these theoretical results, the paper proposes an efficient regularization path algorithm, LARS-sGMC, which extends the well-known least angle regression (LARS) algorithm from LASSO to the sGMC model. LARS-sGMC is proven correct and shown to terminate in finitely many iterations under a mild assumption.
The results show that despite the nonconvex nature of the sGMC penalty, the sGMC model preserves many celebrated properties of the LASSO model, making it a promising less biased surrogate of LASSO.
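The piecewise-linear regularization path described above can be illustrated in the convex LASSO special case, which LARS (and, per the paper, LARS-sGMC) traces exactly. The sketch below uses scikit-learn's `lars_path` on synthetic data; it is an illustration of the LASSO analogue only, not an implementation of the paper's LARS-sGMC algorithm, and the data dimensions are arbitrary choices.

```python
# Illustrative sketch: the LASSO regularization path computed by LARS.
# Between consecutive breakpoints (alphas) the path is exactly linear in
# the regularization parameter; LARS returns only those breakpoints.
import numpy as np
from sklearn.linear_model import lars_path

rng = np.random.default_rng(0)
m, n = 50, 20                        # observations, features (arbitrary)
A = rng.standard_normal((m, n))      # Gaussian columns are in general
                                     # position with probability one
x_true = np.zeros(n)
x_true[:3] = [2.0, -1.5, 1.0]        # sparse ground truth
y = A @ x_true + 0.05 * rng.standard_normal(m)

# alphas: path breakpoints (decreasing); coefs[:, k]: solution at alphas[k]
alphas, active, coefs = lars_path(A, y, method="lasso")

print(f"{len(alphas)} breakpoints; "
      f"{np.count_nonzero(coefs[:, -1])} nonzeros at the smallest alpha")
```

At the largest breakpoint the solution is identically zero, and variables enter the active set one breakpoint at a time, which is what makes a path-following algorithm of this kind efficient.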
Stats
Every sGMC solution x satisfies the ℓ1-norm bound ‖x‖₁ ≤ ‖y‖₂² / (2λ(1−ρ)).
Every sGMC solution attains the same data-fidelity value and the same sGMC penalty value.
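The ℓ1-norm bound can be sanity-checked numerically in the convex special case ρ = 0, where the bound reads ‖x‖₁ ≤ ‖y‖₂²/(2λ) and the model reduces to the standard LASSO objective (1/2)‖y − Ax‖₂² + λ‖x‖₁. The minimal ISTA solver below is an illustrative sketch under that assumption, not the paper's method; since ISTA starts at x = 0 and decreases the objective monotonically, the bound holds at every iterate.

```python
# Sanity check of the l1-norm bound for rho = 0 (LASSO special case):
#   lambda * ||x||_1 <= objective(x) <= objective(0) = (1/2)||y||_2^2,
# hence ||x||_1 <= ||y||_2^2 / (2 * lambda).
import numpy as np

def ista_lasso(A, y, lam, n_iter=5000):
    """Iterative soft-thresholding for (1/2)||y - Ax||^2 + lam*||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L with L = ||A||_2^2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = A.T @ (A @ x - y)                # gradient of the fidelity term
        z = x - step * g
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((40, 15))            # arbitrary test dimensions
y = rng.standard_normal(40)
lam = 0.5

x_star = ista_lasso(A, y, lam)
bound = np.dot(y, y) / (2 * lam)             # ||y||_2^2 / (2*lambda)
print(np.abs(x_star).sum(), "<=", bound)
```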