The paper addresses the challenge of learning from corrupted data, focusing on outlier-robust nonconvex optimization. It introduces a framework for efficiently finding approximate second-order stationary points (SOSPs) with dimension-independent accuracy guarantees when a fraction of the samples is adversarially corrupted. The framework is applied to low-rank matrix sensing, highlighting the importance of SOSPs in nonconvex formulations of machine learning problems. The work also establishes a statistical query lower bound, indicating that a quadratic dependence of the sample complexity on the dimension is necessary for efficient algorithms. Overall, the results shed light on outlier-robust stochastic optimization beyond the convex setting.
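To make the setup concrete, the sketch below (not the paper's algorithm) combines a robust gradient estimate with perturbed gradient descent on a symmetric low-rank matrix sensing loss. It uses a coordinate-wise trimmed mean as a simple stand-in for the stronger robust estimators the paper relies on to obtain dimension-independent error, and adds a small random perturbation near stationary points, the standard device for escaping strict saddles toward an approximate SOSP. All names, constants, and the estimator choice are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Symmetric low-rank matrix sensing with an eps-fraction of corrupted responses:
# y_i = <A_i, U* U*^T>, where the A_i are random symmetric sensing matrices.
d, r, n, eps = 20, 2, 2000, 0.1
U_star = rng.normal(size=(d, r)) / np.sqrt(d)
A = rng.normal(size=(n, d, d))
A = (A + A.transpose(0, 2, 1)) / 2                    # symmetrize sensing matrices
y = np.einsum('nij,ij->n', A, U_star @ U_star.T)
bad = rng.choice(n, size=int(eps * n), replace=False)
y[bad] += 50.0 * rng.normal(size=bad.size)            # gross outliers in the responses

def sample_gradients(U):
    """Per-sample gradients of f_i(U) = (<A_i, U U^T> - y_i)^2, shape (n, d, r)."""
    resid = np.einsum('nij,ij->n', A, U @ U.T) - y
    return 4.0 * resid[:, None, None] * np.einsum('nij,jk->nik', A, U)

def trimmed_mean(G, frac):
    """Coordinate-wise trimmed mean: drop the largest and smallest frac-fraction
    of samples in each coordinate before averaging (placeholder robust estimator)."""
    k = int(frac * G.shape[0])
    return np.sort(G, axis=0)[k:G.shape[0] - k].mean(axis=0)

# Perturbed gradient descent with a robust gradient oracle (sketch):
# a small random perturbation is added when the gradient is tiny, the usual
# trick for escaping strict saddle points toward an approximate SOSP.
U = rng.normal(size=(d, r)) / np.sqrt(d)
step, grad_tol, noise = 0.02, 1e-3, 1e-2
for _ in range(4000):
    g = trimmed_mean(sample_gradients(U), frac=eps)
    if np.linalg.norm(g) < grad_tol:
        g = g + noise * rng.normal(size=g.shape)      # perturb near stationary points
    U -= step * g

print("||UU^T - U*U*^T||_F =", np.linalg.norm(U @ U.T - U_star @ U_star.T))
```

The trimmed mean is only a placeholder: with an eps-fraction of gross outliers it keeps the aggregated gradient bounded, whereas a plain sample average would be dominated by the corrupted responses.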
Key insights extracted from the source by Shuyao Li, Yu... at arxiv.org, 03-19-2024: https://arxiv.org/pdf/2403.10547.pdf