Core Concepts
Efficiently finding approximate second-order stationary points (SOSPs) when a fraction of the data is corrupted by outliers.
Abstract
The paper addresses the challenge of learning from corrupted data, focusing on robust nonconvex optimization. It introduces a framework for efficiently finding approximate second-order stationary points (SOSPs) with dimension-independent accuracy guarantees. As an application, the framework is instantiated for low-rank matrix sensing, underscoring the central role SOSPs play in nonconvex formulations of machine learning problems. The work also establishes a statistical query lower bound indicating that quadratic sample complexity is necessary for efficient algorithms. Overall, the research offers new tools for outlier-robust stochastic optimization.
Stats
n = Õ(D²/ε)
L_g = 16Γ and L_H = 24Γ^{1/2} inside the region {U : ‖U‖²_op < Γ}
σ_g = 8rΓ^{3/2} and σ_H = 16r^{3/2}Γ
Quotes
"Finding an approximate second-order stationary point (SOSP) is a well-studied and fundamental problem in stochastic nonconvex optimization."
"In this paper, we study the problem of finding SOSPs in the strong contamination model."
"Our work is the first to find approximate SOSPs with dimension-independent errors in outlier-robust settings."