Core Concepts
The Sum-of-Squares (SOS) technique is a powerful tool for optimization, with extensions to information theory.
Summary
The content covers the theory and applications of the Sum-of-Squares (SOS) technique in optimization. It spans lectures on convex optimization, the representation of non-negative functions as sums of squares, the tightness of the resulting approximations, optimal control, and kernel methods, and extends the technique to information theory, focusing on the log partition function and the kernel KL divergence. Applications across these areas are explored with detailed attention to the mathematical foundations.
I. Theory and Applications of the Sum-of-Squares Technique:
- Introduction to the SOS method for deriving lower bounds in optimization (see the sketch after this list).
- Application in finite-dimensional feature spaces, extended to infinite-dimensional spaces via reproducing kernels.
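A minimal sketch of the lower-bound formulation that the SOS method relaxes (standard formulation; the notation $f$, $\mathcal{X}$, $q_i$ is assumed here, not taken from the source):

$$\inf_{x \in \mathcal{X}} f(x) \;=\; \sup_{c \in \mathbb{R}} \; c \quad \text{subject to} \quad f(x) - c \geq 0 \ \text{ for all } x \in \mathcal{X}.$$

The SOS relaxation replaces the pointwise non-negativity constraint by the stronger requirement that $f - c$ be a sum of squares, $f(x) - c = \sum_i q_i(x)^2$, with the $q_i$ drawn from a finite-dimensional feature space or, via reproducing kernels, an infinite-dimensional one; any feasible $c$ is then a certified lower bound on the infimum.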
II. Lecture 1 - Convex Optimization:
- Transformation of non-convex problems into tractable semidefinite programs using SOS (see the example after this list).
- Representation of non-negative functions as sums-of-squares.
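A minimal, self-contained sketch of how the SOS constraint becomes a semidefinite program, written here in Python with cvxpy; the example polynomial $p(x) = x^4 - 3x^2 + 1$, the monomial basis, and the solver are illustrative choices, not taken from the lectures:

```python
import cvxpy as cp

# Lower-bound p(x) = x^4 - 3x^2 + 1 by the largest c such that
# p(x) - c is a sum of squares, i.e. p(x) - c = z(x)^T Q z(x)
# with z(x) = [1, x, x^2] and Q positive semidefinite.
Q = cp.Variable((3, 3), PSD=True)
c = cp.Variable()

constraints = [
    Q[0, 0] == 1 - c,             # constant term
    2 * Q[0, 1] == 0,             # coefficient of x
    2 * Q[0, 2] + Q[1, 1] == -3,  # coefficient of x^2
    2 * Q[1, 2] == 0,             # coefficient of x^3
    Q[2, 2] == 1,                 # coefficient of x^4
]

prob = cp.Problem(cp.Maximize(c), constraints)
prob.solve()
print("SOS lower bound:", c.value)  # approx. -1.25, the true minimum here
```

Since the polynomial is univariate, non-negativity and SOS coincide, so the bound is tight; in general the SOS value is only a lower bound on the non-convex minimum.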
III. Lecture 2 - Optimal Control:
- Formulation of optimal control problems via the Hamilton-Jacobi-Bellman (HJB) equation.
- Application of the sum-of-squares relaxation to obtain tractable approximate solutions in optimal control (see the sketch after this list).
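One standard way this relaxation is set up (a sketch under assumed notation: dynamics $\dot{x} = f(x, u)$, running cost $\ell$, terminal cost $g$; the lectures may use a different setting): the value function solves the HJB equation

$$\partial_t V(x, t) + \inf_{u} \big[ \ell(x, u) + \nabla_x V(x, t)^\top f(x, u) \big] = 0, \qquad V(x, T) = g(x),$$

and any smooth $W$ satisfying the relaxed inequalities

$$\partial_t W(x, t) + \ell(x, u) + \nabla_x W(x, t)^\top f(x, u) \;\geq\; 0 \quad \text{for all } (x, u, t), \qquad W(x, T) \leq g(x),$$

is a lower bound on $V$. Replacing these pointwise inequalities by sum-of-squares certificates turns the search for the best such $W$ into a semidefinite program.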
IV. Connection to Information Theory:
- Computation of the log partition function using SOS relaxation techniques.
- Exploration of the kernel KL divergence for information-theoretic computations (see the identity after this list).
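A standard variational identity links the two items above (stated here with assumed notation, $\mu$ for the base measure and $q$ for a candidate distribution; not quoted from the source): the log partition function satisfies

$$\log \int e^{f(x)}\, d\mu(x) \;=\; \sup_{q \ll \mu} \Big\{ \mathbb{E}_{q}[f] - \mathrm{KL}(q \,\|\, \mu) \Big\},$$

so computing it is itself an optimization problem over distributions, with the KL divergence as the regularizer; this is one way to see why SOS-type relaxations and kernel (RKHS) parameterizations carry over to the information-theoretic side.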
Statistics
"Indeed the right hand side is clearly a convex problem, i.e., minimization over a convex set plus linear constraints in c, while on the left hand side we have a non-convex one in the most general setting."
"By exploiting the convexity of the problem (strong duality) we can invert inf and sup..."
"The solution is - inf A≽0 tr[AB] = (0 if B ≽ 0 +∞ otherwise)"
Quotes
"All problems are convex."
"A rather simple statement which is pivotal for the following discussion is..."
"If K = ˆK then the non-negative constraint can be expressed as a SOS..."