The paper introduces a new first-order algorithm for approximately solving linear programs (LPs) that achieves polynomial-time convergence rates. The key innovations are:
The algorithm's convergence rate depends on the circuit imbalance measure of the constraint matrix rather than on the Hoffman constant; the former can be much smaller, leading to stronger guarantees.
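For context, the circuit imbalance measure of a matrix $A$ is commonly defined in this literature as the largest ratio between entries of an elementary (support-minimal) vector of the kernel; the notation below is the standard one, not necessarily the paper's:

$$\kappa_A \;=\; \max\left\{\, \left|\frac{g_i}{g_j}\right| \;:\; g \in \mathcal{E}(A),\; i, j \in \operatorname{supp}(g) \right\},$$

where $\mathcal{E}(A)$ is the set of support-minimal nonzero vectors of $\ker(A)$. For a totally unimodular matrix, every elementary vector has entries in $\{0, \pm 1\}$, so $\kappa_A = 1$.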
The algorithm repeatedly calls a restarted fast gradient method (R-FGM) on a carefully designed potential function, and gradually fixes variables to their upper or lower bounds based on primal-dual complementarity conditions.
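The restarted scheme can be sketched as follows. This is a minimal illustration of a restarted accelerated (Nesterov) gradient method on a simple smooth objective; the test function, step size, and restart schedule are illustrative assumptions, not the paper's actual potential function.

```python
import numpy as np

def fgm(grad, x0, L, n_iters):
    """Nesterov's fast gradient method with step size 1/L."""
    x, y = x0.copy(), x0.copy()
    t = 1.0
    for _ in range(n_iters):
        x_next = y - grad(y) / L
        t_next = (1 + (1 + 4 * t * t) ** 0.5) / 2
        y = x_next + ((t - 1) / t_next) * (x_next - x)  # momentum step
        x, t = x_next, t_next
    return x

def restarted_fgm(grad, x0, L, restart_period, n_restarts):
    """Restart FGM from its last iterate; restarting recovers a linear
    rate on strongly convex (or sharp) objectives."""
    x = x0
    for _ in range(n_restarts):
        x = fgm(grad, x, L, restart_period)
    return x

# Illustrative smooth, strongly convex objective: f(x) = 0.5 * ||A x - b||^2
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A.T @ (A @ x - b)
L = np.linalg.norm(A.T @ A, 2)       # smoothness constant of f
x_star = np.linalg.solve(A, b)       # exact minimizer, for comparison
x_hat = restarted_fgm(grad, np.zeros(2), L, restart_period=50, n_restarts=10)
```

The restart period here is fixed; in the paper it is tied to the convergence guarantee, so that failure to make the guaranteed progress can be detected.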
The algorithm handles arbitrary linear programs, not just those with totally unimodular constraint matrices, and its running time depends only polynomially on the logarithms of the problem parameters, in contrast to previous first-order methods.
Because the circuit imbalance measure is hard to approximate, the authors also provide a guessing procedure that sidesteps computing it directly, without affecting the asymptotic running time.
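A guessing procedure of this kind is typically a doubling scheme. The sketch below is a hedged illustration under the assumption that a hypothetical `solve_with_guess(guess)` returns a solution when the guessed parameter is large enough and `None` otherwise; it is not the paper's exact procedure. Geometric growth of the guess keeps the total work within a constant factor of running once with an adequate guess.

```python
def solve_with_doubling(solve_with_guess, max_guess=2**30):
    """Try guesses 2, 4, 8, ... until the solver succeeds."""
    guess = 2
    while guess <= max_guess:
        result = solve_with_guess(guess)
        if result is not None:      # a too-small guess is detected as failure
            return result, guess
        guess *= 2
    raise RuntimeError("parameter guess exceeded max_guess")

# Toy stand-in: succeeds once the guess reaches the (unknown) true value 37.
toy = lambda g: "ok" if g >= 37 else None
result, guess = solve_with_doubling(toy)
```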
The algorithm first solves a feasibility problem to find a δ-feasible solution, and then gradually optimizes the objective by fixing variables and updating the cost function. The key technical ingredients are proximity results relating the current solution to the optimal one, and a novel variable fixing scheme based on approximate complementarity conditions.
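The variable-fixing idea can be illustrated in miniature. For an LP with box constraints, if the reduced cost of a variable is strongly positive, complementarity suggests an optimal solution has that variable at its lower bound, and symmetrically for strongly negative reduced costs. The threshold below is a hypothetical stand-in for the paper's proximity-based bound.

```python
def fix_variables(s, lo, hi, threshold):
    """Return {j: fixed_value} for coordinates whose reduced cost s[j]
    exceeds the threshold in absolute value."""
    fixed = {}
    for j, sj in enumerate(s):
        if sj > threshold:        # strongly positive reduced cost -> lower bound
            fixed[j] = lo[j]
        elif sj < -threshold:     # strongly negative -> upper bound
            fixed[j] = hi[j]
    return fixed

reduced_costs = [2.5, -0.1, -3.0, 0.2]
fixed = fix_variables(reduced_costs, lo=[0, 0, 0, 0], hi=[1, 1, 1, 1], threshold=1.0)
```

Here variables 0 and 2 are fixed (to 0 and 1 respectively), while 1 and 3 remain free for the next round of optimization.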
Source: arxiv.org