The paper presents a method for solving bilevel optimization problems in which hyperparameters of variational regularization models, used across a range of data science tasks, are learned. The key highlights are:
The authors formulate hyperparameter learning as a bilevel optimization problem: the upper level minimizes a loss function over the hyperparameters, while the lower level solves the variational regularization problem for the given hyperparameters.
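In generic form (the symbols here are illustrative, not the paper's exact notation), such a bilevel problem reads:

```latex
\min_{\theta}\; F(\theta) \;=\; \frac{1}{n}\sum_{i=1}^{n} \ell\bigl(\hat{x}_i(\theta)\bigr)
\qquad \text{s.t.} \qquad
\hat{x}_i(\theta) \;\in\; \arg\min_{x}\; g_i(x,\theta),
```

where each lower-level objective \(g_i\) is a variational regularization functional (e.g. a data-fit term plus a \(\theta\)-weighted regularizer), \(\ell\) is the upper-level loss (e.g. distance to a ground-truth reconstruction), and the hyperparameters \(\theta\) influence \(F\) only through the lower-level solutions \(\hat{x}_i(\theta)\).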
Due to the large-scale nature of the problems and the use of numerical solvers, computing the exact hypergradient (gradient of the upper-level objective with respect to the hyperparameters) is not feasible. The authors propose an algorithm that relies on inexact function evaluations and hypergradients, and dynamically determines the required accuracy for these quantities.
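To make the inexactness concrete, here is a minimal sketch of an inexact hypergradient computed via implicit differentiation on a toy ridge-regularized problem. Everything here (the quadratic lower level, the scalar hyperparameter, the tolerance names) is an illustrative assumption, not the paper's setup: both the lower-level solve and the adjoint linear system are only solved to requested tolerances, which is the kind of approximation whose accuracy MAID controls dynamically.

```python
import numpy as np

def cg(matvec, b, tol, max_iter=500):
    """Conjugate gradient: solve H q = b until the residual norm is <= tol."""
    q = np.zeros_like(b)
    r = b - matvec(q)
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        if np.sqrt(rs) <= tol:
            break
        Hp = matvec(p)
        alpha = rs / (p @ Hp)
        q += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return q

def inexact_hypergradient(theta, A, b, x_true, eps_lower, eps_linear):
    """Approximate hypergradient of f(theta) = 0.5*||x*(theta) - x_true||^2
    with lower-level g(x, theta) = 0.5*||Ax - b||^2 + 0.5*theta*||x||^2.

    Both the lower-level solve and the adjoint system are solved only to
    the requested tolerances, mimicking an inexact oracle."""
    H = lambda v: A.T @ (A @ v) + theta * v      # lower-level Hessian
    # Inexact lower-level solve: (A^T A + theta I) x = A^T b.
    x = cg(H, A.T @ b, eps_lower)
    # Inexact adjoint solve: H q = grad_x f(x).
    q = cg(H, x - x_true, eps_linear)
    # d/dtheta of grad_x g is x, and f has no direct theta-dependence,
    # so the (approximate) hypergradient is -q^T x.
    return -(q @ x), x
```

Tightening `eps_lower` and `eps_linear` drives the approximation toward the exact hypergradient; the point of an adaptive scheme is to keep these tolerances as loose as the convergence guarantees allow.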
The authors introduce a verifiable backtracking line search scheme that utilizes only inexact function evaluations and the inexact hypergradient, and guarantees sufficient decrease in the exact upper-level objective function.
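A sufficient-decrease test that stays valid under inexactness can be sketched as follows. The constants and the precise acceptance condition in the paper differ; in this hypothetical version, `f_approx` returns function values accurate to within `err`, and the Armijo test is tightened by the error bound on both sides so that acceptance provably implies decrease in the exact objective.

```python
import numpy as np

def inexact_backtracking(theta, direction, f_approx, err,
                         c1=1e-4, rho=0.5, alpha0=1.0, max_iter=30):
    """Backtracking line search using only inexact function values.

    f_approx(theta) estimates the upper-level objective to within +/- err.
    Accepting a step only when
        f_approx(new) + err <= f_approx(old) - err - c1*alpha*||d||^2
    guarantees f_exact(new) <= f_exact(old) - c1*alpha*||d||^2,
    i.e. sufficient decrease in the *exact* objective."""
    f_old = f_approx(theta)
    sq_norm = float(np.dot(direction, direction))
    alpha = alpha0
    for _ in range(max_iter):
        cand = theta - alpha * direction
        if f_approx(cand) + err <= f_old - err - c1 * alpha * sq_norm:
            return cand, alpha
        alpha *= rho
    return None, 0.0  # failed: the caller should tighten err and retry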
The proposed algorithm, the Method of Adaptive Inexact Descent (MAID), combines the theoretical results on the inexact descent direction with the line search, yielding a robust and efficient method for bilevel learning.
Numerical experiments on various problems, such as multinomial logistic regression and variational image denoising, demonstrate the efficiency and feasibility of the MAID approach, outperforming state-of-the-art methods.
Key insights from arxiv.org
by Mohammad Sad..., 04-12-2024
https://arxiv.org/pdf/2308.10098.pdf