Core Concepts
An algorithm that selects adaptive stepsizes using novel estimates of the local geometry.
Abstract
Introduction:
Backtracking line search is a common approach for choosing stepsizes in smooth optimization.
The proposed adaptive proximal gradient method eliminates the need for backtracking steps or function value evaluations.
It adapts the stepsize based on estimates of the local geometry.
Adaptive Proximal Gradient Method:
Algorithmic overview provided.
Preliminary lemmas establish estimates of the local Lipschitz modulus.
Convergence results show boundedness and convergence to a solution.
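To make the idea concrete, here is a minimal sketch of a proximal gradient loop whose stepsize is driven by a local Lipschitz estimate instead of backtracking. This is an illustrative rule, not the paper's exact update: the estimate `L` (gradient difference over iterate difference), the growth factor `sqrt(1 + gamma/gamma_prev)`, and the cap `1/(2L)` are assumptions chosen to mimic the nonmonotone adaptive behavior described above.

```python
import numpy as np

def adaptive_prox_grad(grad_f, prox_g, x0, gamma0=1e-2, iters=1000):
    # Forward-backward splitting with an adaptive stepsize: no backtracking,
    # no function value evaluations -- only gradients and iterate differences.
    x = np.asarray(x0, dtype=float)
    g = grad_f(x)
    gamma_prev = gamma = gamma0
    for _ in range(iters):
        x_new = prox_g(x - gamma * g, gamma)          # proximal gradient step
        g_new = grad_f(x_new)
        dx, dg = x_new - x, g_new - g
        # local Lipschitz modulus estimated along the last step
        L = np.linalg.norm(dg) / max(np.linalg.norm(dx), 1e-12)
        # nonmonotone rule (illustrative): let gamma grow, but cap it
        # by the inverse of the local curvature estimate
        grow = gamma * np.sqrt(1.0 + gamma / gamma_prev)
        cap = 1.0 / (2.0 * L) if L > 0 else np.inf
        gamma_prev, gamma = gamma, min(grow, cap)
        x, g = x_new, g_new
    return x

# Usage on a small lasso problem: min 0.5||Ax - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
b = rng.standard_normal(40)
lam = 0.1
grad_f = lambda x: A.T @ (A @ x - b)                        # smooth part
soft = lambda z, t: np.sign(z) * np.maximum(np.abs(z) - lam * t, 0.0)
x_hat = adaptive_prox_grad(grad_f, soft, np.zeros(20))
```

The stepsize can both grow and shrink across iterations, which is what makes the rule nonmonotone; no objective values are ever evaluated.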
Adaptive Three-Term Primal-Dual Methods:
Extension to primal-dual setting for composite minimization problems.
Handles nonsmooth terms and provides convergence results.
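The paper's adaptive three-term methods build on primal-dual iterations of the following shape for problems min_x f(x) + g(x) + h(Ax). The sketch below uses a standard Condat-Vu-style scheme with fixed stepsizes `tau` and `sigma` (satisfying the usual stability condition) rather than the paper's adaptive variant; the total-variation denoising example and all parameter values are illustrative assumptions.

```python
import numpy as np

def primal_dual(grad_f, prox_g, prox_h_conj, A, x0, y0, tau, sigma, iters=3000):
    # Three-term primal-dual iteration for min_x f(x) + g(x) + h(Ax):
    # f smooth, g and h nonsmooth but prox-friendly, h composed with A.
    # Fixed tau/sigma here, whereas the paper adapts them on the fly.
    x, y = x0.copy(), y0.copy()
    for _ in range(iters):
        x_new = prox_g(x - tau * (grad_f(x) + A.T @ y), tau)     # primal step
        y = prox_h_conj(y + sigma * A @ (2 * x_new - x), sigma)  # dual step
        x = x_new
    return x, y

# Usage: 1D total-variation denoising, min 0.5||x - b||^2 + lam*||Dx||_1
n, lam = 30, 0.5
rng = np.random.default_rng(1)
b = np.repeat([0.0, 2.0, -1.0], n // 3) + 0.1 * rng.standard_normal(n)
D = np.diff(np.eye(n), axis=0)                  # (n-1) x n difference matrix
grad_f = lambda x: x - b                        # f(x) = 0.5||x - b||^2
prox_g = lambda z, t: z                         # g = 0
prox_h_conj = lambda y, s: np.clip(y, -lam, lam)  # conjugate of lam*|.|_1
x_hat, y_hat = primal_dual(grad_f, prox_g, prox_h_conj, D,
                           np.zeros(n), np.zeros(n - 1), tau=0.5, sigma=0.25)
```

The dual step handles the nonsmooth term h(Ax) through the prox of its conjugate, which is what lets the method avoid evaluating the prox of h composed with A directly.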
Numerical Simulations:
Effectiveness of the proposed algorithms demonstrated in comparison with the state of the art.
Conclusions:
Contributions summarized, including nonmonotone adaptive stepsize rule and extension to primal-dual setting.
Stats
This work was supported by: Research Foundation Flanders (FWO) postdoctoral fellowship 12Y7622N and research projects G081222N, G033822N, and G0A0920N; Research Council KU Leuven C1 project No. C14/18/068; the European Union's Horizon 2020 research and innovation programme under Marie Skłodowska-Curie grant agreement No. 953348; and Japan Society for the Promotion of Science (JSPS) KAKENHI grant JP21K17710.
Quotes
"Eliminates the need for backtracks or function value evaluations."
"Proposed algorithm does not require any parameter tuning."