Core Concepts
Optimistic Online Mirror Descent (OMD) provides theoretical guarantees for the Stochastically Extended Adversarial (SEA) model across several function classes.
Abstract
The paper analyzes the optimistic Online Mirror Descent (OMD) algorithm as a bridge between stochastic and adversarial online convex optimization. It establishes theoretical guarantees for convex, strongly convex, and exp-concave smooth functions, detailing the update rule, the choice of regularizer, and the resulting regret bound for each function type.
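To make the update concrete, here is a minimal sketch of optimistic OMD with a Euclidean regularizer (in which case the mirror-descent update reduces to optimistic gradient descent). The callbacks `grad_fn` and `hint_fn`, the ball-shaped domain, and the step size are illustrative assumptions, not details from the paper:

```python
import numpy as np

def project_ball(x, radius=1.0):
    # Euclidean projection onto an L2 ball of the given radius
    # (the bounded domain is an assumption for this sketch)
    n = np.linalg.norm(x)
    return x if n <= radius else x * (radius / n)

def optimistic_omd(grad_fn, hint_fn, T=100, dim=2, eta=0.1):
    """Optimistic OMD sketch with the Euclidean regularizer.

    grad_fn(x, t) -- hypothetical callback returning the gradient revealed
                     after playing x in round t
    hint_fn(t)    -- hypothetical callback returning the optimism hint M_t
                     (e.g. the previous round's gradient)
    """
    x_hat = np.zeros(dim)                      # auxiliary iterate
    x = x_hat
    for t in range(T):
        m = hint_fn(t)                         # optimism hint M_t
        x = project_ball(x_hat - eta * m)      # decision x_t actually played
        g = grad_fn(x, t)                      # gradient revealed by the loss
        x_hat = project_ball(x_hat - eta * g)  # auxiliary (mirror) update
    return x
```

A common instantiation of the hint is the last observed gradient, which is what makes the regret adapt to gradient variation in smooth settings; the paper's analysis varies the regularizer (and hence the Bregman divergence) per function class rather than fixing it to the Euclidean one used here.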
Introduction to Online Convex Optimization (OCO)
The Stochastically Extended Adversarial (SEA) Model
Results for Convex and Smooth Functions
Results for Strongly Convex and Smooth Functions
Results for Exp-Concave and Smooth Functions
Stats
5D²/√δ + 5G² + 25D³L²√δ
O((1/λ)(σ²_max + Σ²_max) log((σ²_1:T + Σ²_1:T)/(σ²_max + Σ²_max)))
O((d/α) log(σ²_1:T + Σ²_1:T))
Quotes
"We investigate the theoretical guarantees of optimistic Online Mirror Descent (OMD) for the SEA model with smooth expected loss functions."
"Our approach yields a new O(d log(σ²_1:T + Σ²_1:T)) bound."
"Optimistic FTRL can achieve the same guarantee as Sachs et al. (2022)."