Core Concept
Formalizing complexity analysis of first-order optimization algorithms using Lean4.
Summary
The article discusses formalizing the complexity analysis of first-order optimization methods in the Lean4 theorem prover. It covers formal definitions of gradients and subgradients, properties of convex and strongly convex functions, Lipschitz-smooth functions, and convergence-rate proofs for gradient descent, subgradient descent, and the proximal gradient method.
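To give a flavor of what such a formalization looks like, here is a hypothetical sketch of how a gradient descent convergence theorem might be stated in Lean 4 in Mathlib style. This is not the paper's actual code: the names `GD`, `gd_convergence`, and all field and hypothesis names are illustrative assumptions, and the proof is left as `sorry`.

```lean
import Mathlib

variable {E : Type*} [NormedAddCommGroup E] [InnerProductSpace ℝ E]

/-- Illustrative bundle of a gradient-descent run on `f`:
iterates `x`, gradient oracle `g`, and a fixed step size `t`. -/
structure GD (f : E → ℝ) where
  x : ℕ → E                                   -- iterate sequence x₀, x₁, …
  g : E → E                                   -- gradient oracle for f
  t : ℝ                                       -- fixed step size
  grad : ∀ y, HasGradientAt f (g y) y         -- g y is the gradient of f at y
  update : ∀ k, x (k + 1) = x k - t • g (x k) -- the descent step

/-- Hypothetical statement of the O(1/k) rate: for a convex, L-smooth `f`
with step `t = 1/L`, f(xₖ) − f(x*) ≤ ‖x₀ − x*‖² / (2 k t). -/
theorem gd_convergence (f : E → ℝ) (alg : GD f) (xm : E) (L : ℝ)
    (hconv : ConvexOn ℝ Set.univ f) (hL : 0 < L)
    (hsmooth : LipschitzWith (Real.toNNReal L) alg.g)
    (hstep : alg.t = 1 / L) (hmin : IsMinOn f Set.univ xm)
    (k : ℕ) (hk : 0 < k) :
    f (alg.x k) - f xm ≤ ‖alg.x 0 - xm‖ ^ 2 / (2 * (k : ℝ) * alg.t) := by
  sorry
```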
Key Statistics
The convergence rate of the gradient descent algorithm is O(1/k) for convex functions and O(ρ^k), with ρ ∈ (0, 1), for strongly convex functions.
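As a quick numerical check of the strongly convex case, the sketch below (illustrative, not from the paper) runs fixed-step gradient descent on a quadratic whose Hessian has eigenvalues in [m, L] and verifies the classical per-step contraction by ρ = (L − m)/(L + m); the matrix, constants, and tolerance are assumptions for the demo.

```python
# Gradient descent on a strongly convex quadratic f(x) = 0.5 * x^T A x
# exhibits a linear rate: the error shrinks by a fixed factor rho < 1
# per iteration, i.e. ||x_k - x*|| = O(rho^k).
import numpy as np

def gradient_descent(A, x0, step, iters):
    """Fixed-step gradient descent for f(x) = 0.5 x^T A x (minimizer x* = 0)."""
    x = x0.copy()
    errs = [np.linalg.norm(x)]
    for _ in range(iters):
        x = x - step * (A @ x)            # the gradient of f is A x
        errs.append(np.linalg.norm(x))
    return np.array(errs)

rng = np.random.default_rng(0)
# Diagonal A with eigenvalues in [m, L]: f is m-strongly convex and L-smooth.
m, L = 1.0, 10.0
A = np.diag(np.linspace(m, L, 5))
x0 = rng.standard_normal(5)

errs = gradient_descent(A, x0, step=2 / (m + L), iters=50)
rho = (L - m) / (L + m)                   # classical contraction factor
# Check the per-step contraction ||x_{k+1} - x*|| <= rho * ||x_k - x*||.
assert all(errs[k + 1] <= rho * errs[k] + 1e-12 for k in range(50))
print("linear rate confirmed: rho =", rho)
```

With the step size 2/(m + L), every eigenmode of the iteration matrix I − αA has magnitude at most ρ, which is exactly why the squared-error bound in the next statistic contracts geometrically.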
The convergence rate of the subgradient descent method is given as ∥x_k − x*∥² ≤ (1 − 2αmL/(m + L))^k ∥x_0 − x*∥².
The convergence rate of the proximal gradient method is ψ(x_k) − ψ* ≤ ∥x_0 − x*∥² / (2kt), where t is the step size.
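This O(1/k) certificate can also be checked numerically. The sketch below (illustrative, not the paper's code) runs the proximal gradient method (ISTA) on a small lasso problem and verifies that the bound ∥x_0 − x*∥²/(2kt) dominates the actual suboptimality at every iteration; the problem data, sizes, and tolerances are assumptions, and a long ISTA run stands in for the exact optimum.

```python
# Proximal gradient (ISTA) on psi(x) = 0.5*||Ax - b||^2 + lam*||x||_1,
# checking the O(1/k) bound psi(x_k) - psi* <= ||x0 - x*||^2 / (2*k*t).
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (component-wise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista(A, b, lam, t, x0, iters):
    """Proximal gradient with fixed step t; returns the iterate history."""
    xs = [x0]
    x = x0.copy()
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                  # gradient of the smooth part
        x = soft_threshold(x - t * grad, t * lam)
        xs.append(x.copy())
    return xs

rng = np.random.default_rng(1)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
lam = 0.5
L = np.linalg.norm(A, 2) ** 2                     # Lipschitz constant of the gradient
t = 1.0 / L
psi = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum()

x0 = np.zeros(10)
xs = ista(A, b, lam, t, x0, iters=2000)
x_star, psi_star = xs[-1], psi(xs[-1])            # high-accuracy reference optimum
R2 = np.linalg.norm(x0 - x_star) ** 2

# The O(1/k) certificate should dominate the suboptimality at every k.
for k in range(1, 200):
    assert psi(xs[k]) - psi_star <= R2 / (2 * k * t) + 1e-8
print("O(1/k) bound verified for the first 200 iterations")
```

Because the reference optimum is itself an ISTA iterate, ψ(x_ref) ≥ ψ*, so the measured gap only understates the true one and the check is conservative.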