This paper proposes IPDS-ADMM, a novel proximal linearized ADMM algorithm employing an increasing penalty and decreasing smoothing strategy, to efficiently solve multi-block nonconvex composite optimization problems with minimal continuity assumptions, achieving an oracle complexity of O(ε^{-3}) for an ε-approximate critical point.
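To illustrate the "increasing penalty" idea on a toy convex instance, here is a minimal proximal ADMM sketch whose penalty parameter grows with the iteration count. The problem, the penalty schedule `rho0 * sqrt(1 + k)`, and all variable names are illustrative assumptions; this is not the paper's IPDS-ADMM scheme or its smoothing schedule.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*|.| (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_increasing_penalty(a=2.0, iters=200, rho0=1.0):
    """Toy ADMM for: min_x 0.5*(x - a)^2 + |z|  s.t.  x = z,
    with a penalty rho_k that increases over iterations
    (an assumed schedule for illustration only)."""
    x = z = lam = 0.0
    for k in range(iters):
        rho = rho0 * np.sqrt(1.0 + k)  # increasing penalty
        # x-update: closed-form minimizer of
        # 0.5*(x - a)^2 + lam*(x - z) + (rho/2)*(x - z)^2
        x = (a - lam + rho * z) / (1.0 + rho)
        # z-update: prox of |.| applied to x + lam/rho
        z = soft_threshold(x + lam / rho, 1.0 / rho)
        # dual ascent on the constraint x - z = 0
        lam += rho * (x - z)
    return x, z
```

For a = 2 the minimizer of 0.5*(x - 2)^2 + |x| is x = 1, and the iterates approach it with the constraint residual x - z shrinking to zero.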
This paper introduces novel algorithms that leverage negative curvature information to efficiently find second-order stationary points in noisy nonlinear nonconvex optimization problems, crucial for machine learning applications.
This paper investigates the presence of spurious local minima in low-rank formulations of sum-of-squares optimization problems, establishing connections between algebraic geometry and optimization, and providing theoretical results and algorithmic strategies to address these challenges.
This paper introduces a novel inexact augmented Lagrangian method (ALM) employing a non-standard augmenting term (a Euclidean norm raised to a power between one and two) to solve nonconvex optimization problems with nonlinear equality constraints. The authors demonstrate both theoretically and empirically that this method allows for faster constraint satisfaction compared to traditional ALM, at the cost of slower minimization of the dual residual, offering a beneficial trade-off for certain practical problems.
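The non-standard augmenting term described above can be written down explicitly; the notation below (f, c, y, ρ, ν) is assumed for illustration and not taken from the paper:

```latex
\mathcal{L}_{\rho}(x, y) \;=\; f(x) \;+\; y^{\top} c(x) \;+\; \frac{\rho}{\nu}\,\lVert c(x)\rVert^{\nu},
\qquad \nu \in (1, 2),
```

where c(x) = 0 collects the nonlinear equality constraints. The classical augmented Lagrangian is recovered at ν = 2; exponents ν < 2 penalize small constraint violations more aggressively, which matches the faster constraint satisfaction reported above.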
This paper introduces a novel penalty barrier method for solving nonconvex constrained optimization problems, employing a marginalization technique to handle slack variables, resulting in smooth subproblems suitable for accelerated solvers.
This paper develops an efficient algorithm for nonconvex minimization with inexact evaluations.
This paper proposes the SHGD algorithm, which solves spectral compressed sensing problems via asymmetric factorization and demonstrates both convergence guarantees and computational efficiency.
This paper studies efficiently finding approximate second-order stationary points in the presence of outliers.