Core Concepts
The authors present an overview of the Sum-of-Squares (SOS) method, which relaxes non-convex global optimization problems into tractable semidefinite programs that yield certified lower bounds.
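To make this reduction concrete, here is a minimal sketch of an SOS lower bound for a univariate polynomial using cvxpy. The polynomial p(x) = x^4 - 3x^2 + 2x + 5 and the feature map φ(x) = (1, x, x^2) are illustrative choices, not taken from the paper.

```python
import cvxpy as cp

# Find the largest c such that p(x) - c is a sum of squares, where
# p(x) = x^4 - 3x^2 + 2x + 5 and phi(x) = (1, x, x^2).
# We require p(x) - c = phi(x)^T Q phi(x) with Q positive semidefinite.
a = [5.0, 2.0, -3.0, 0.0, 1.0]  # coefficients of p: a[k] multiplies x^k

Q = cp.Variable((3, 3), symmetric=True)
c = cp.Variable()

# Match coefficients of x^0 .. x^4 on both sides.
constraints = [
    Q >> 0,
    Q[0, 0] == a[0] - c,            # x^0
    2 * Q[0, 1] == a[1],            # x^1
    2 * Q[0, 2] + Q[1, 1] == a[2],  # x^2
    2 * Q[1, 2] == a[3],            # x^3
    Q[2, 2] == a[4],                # x^4
]
cp.Problem(cp.Maximize(c), constraints).solve()
print("SOS lower bound on min_x p(x):", c.value)
```

Because every non-negative univariate polynomial is a sum of squares, this bound matches the true minimum here; in the multivariate case the same program still returns a valid, though possibly loose, lower bound.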
Abstract
The Sum-of-Squares (SOS) approximation method derives certified lower bounds on objective functions in optimization. By representing non-negative functions as sums of squares in a feature space, it replaces intractable non-convex problems with convex semidefinite ones. The authors cover both finite-dimensional and infinite-dimensional feature spaces, the latter via reproducing kernels, and discuss convex duality formulations, sum-of-squares representations of non-negative functions, and the tightness of the resulting approximations. They also apply SOS to estimating information-theoretic quantities such as the log-partition function, and extend the discussion to optimal control and to a kernel KL divergence.
Key Formulas and Propositions
h(x) = φ(x)∗Hφ(x), where φ: X → C^d is a feature map
H ∈ H_d, the set of Hermitian matrices in C^{d×d}
Proposition 1: h is an SOS if and only if h(x) = φ(x)∗Hφ(x) for some H ∈ H_d with H ≽ 0.
Proposition 2: If h is an SOS, then h is non-negative.
Proposition 3: h(x) = φ(x)∗Hφ(x) is an SOS if there exists H′ ∈ V⊥ such that H − H′ ≽ 0, where V⊥ = {A ∈ H_d : φ(x)∗Aφ(x) = 0 for all x} is the set of representers of the zero function.
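Below is a small numeric check of Propositions 1–3. It works over the reals for simplicity (Hermitian becomes symmetric), and the trigonometric feature map φ(x) = (1, cos x, sin x) and all matrix choices are illustrative, not from the paper.

```python
import numpy as np

def phi(x):
    # Illustrative feature map: phi(x) = (1, cos x, sin x)
    return np.array([1.0, np.cos(x), np.sin(x)])

# Propositions 1-2: if H >= 0, then h(x) = phi(x)^T H phi(x) is an SOS
# (hence non-negative). Eigendecomposing H = sum_i lam_i v_i v_i^T gives
# h(x) = sum_i lam_i (v_i^T phi(x))^2, an explicit sum of squares.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
H = A @ A.T                     # PSD by construction
lam, V = np.linalg.eigh(H)
for x in np.linspace(-3, 3, 7):
    h_direct = phi(x) @ H @ phi(x)
    h_sos = sum(l * (v @ phi(x)) ** 2 for l, v in zip(lam, V.T))
    assert np.isclose(h_direct, h_sos) and h_direct >= 0

# Proposition 3: h(x) = sin^2(x) has the indefinite representer
# H = diag(1, -1, 0), since 1 - cos^2 = sin^2, yet h is an SOS:
# H' = diag(1, -1, -1) represents the zero function (1 - cos^2 - sin^2 = 0),
# and H - H' = diag(0, 0, 1) is PSD.
H = np.diag([1.0, -1.0, 0.0])
Hp = np.diag([1.0, -1.0, -1.0])
for x in np.linspace(-3, 3, 7):
    assert np.isclose(phi(x) @ Hp @ phi(x), 0.0)            # H' represents 0
    assert np.isclose(phi(x) @ H @ phi(x), np.sin(x) ** 2)  # H represents h
assert np.all(np.linalg.eigvalsh(H - Hp) >= -1e-12)         # H - H' >= 0
print("Propositions 1-3 verified on this example.")
```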
Kernel KL divergence formula: D(Σp∥Σq) = tr(Σp(log Σp − log Σq)), where Σp = E_{x∼p}[φ(x)φ(x)∗] and Σq = E_{x∼q}[φ(x)φ(x)∗] are kernel covariance embeddings of the densities p and q.
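As a sanity check on this formula, here is a minimal sketch that evaluates D(Σp∥Σq), assuming Σp and Σq are strictly positive definite with unit trace; the random matrices below merely stand in for actual kernel covariance embeddings.

```python
import numpy as np
from scipy.linalg import logm

def kernel_kl(Sp, Sq):
    # D(Sp || Sq) = tr(Sp (log Sp - log Sq)); assumes Sp, Sq are
    # strictly positive definite so the matrix logarithms exist.
    return float(np.trace(Sp @ (logm(Sp) - logm(Sq))).real)

def random_unit_trace_psd(d, seed):
    # Stand-in for a kernel covariance embedding: PSD with unit trace.
    A = np.random.default_rng(seed).standard_normal((d, d))
    S = A @ A.T + 1e-6 * np.eye(d)  # strictly positive definite
    return S / np.trace(S)

Sp = random_unit_trace_psd(4, seed=0)
Sq = random_unit_trace_psd(4, seed=1)
print(kernel_kl(Sp, Sq))  # non-negative for unit-trace PSD matrices
print(kernel_kl(Sp, Sp))  # zero when the two embeddings coincide
```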