The paper introduces a novel algorithmic framework, the Transformed Gradient Projection (TGP) algorithm, for optimization problems on compact matrix manifolds. The key innovation of the TGP approach is a new class of search directions, combined with several stepsize rules (Armijo, nonmonotone Armijo, and fixed stepsizes) that guide the selection of the next iterate.
The authors focus on the Stiefel and Grassmann manifolds as significant examples, showing that many existing algorithms in the literature arise as specific instances of the proposed TGP framework, which also yields several new special cases.
The paper thoroughly explores the convergence properties of the TGP algorithms under the various search directions and stepsizes. The authors analyze the geometric properties of the projection onto compact matrix manifolds in detail, which allows them to extend classical inequalities for retractions from the literature. Building on these insights, they establish the weak convergence, convergence rate, and global convergence of the TGP algorithms under the three stepsizes.
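To make the underlying mechanism concrete, the sketch below implements a plain gradient projection iteration on the Stiefel manifold with an Armijo backtracking stepsize, where the projection is the polar factor obtained from a thin SVD. This is only a minimal illustration of the classical scheme the TGP framework generalizes, not the authors' transformed search directions; the function names, the test problem, and all parameter values are our own assumptions.

```python
import numpy as np

def project_stiefel(Y):
    # Nearest point on St(n, p) in the Frobenius norm:
    # the polar factor U V^T from a thin SVD (assumes Y has full column rank).
    U, _, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ Vt

def gradient_projection(f, grad_f, X0, alpha0=1.0, beta=0.5,
                        sigma=1e-4, max_iter=500, tol=1e-10):
    # Classical scheme: X_{k+1} = P(X_k - alpha_k * grad f(X_k)),
    # with alpha_k chosen by Armijo backtracking along the projected step.
    X = project_stiefel(X0)
    for _ in range(max_iter):
        G = grad_f(X)
        alpha = alpha0
        while True:
            X_new = project_stiefel(X - alpha * G)
            # Armijo sufficient-decrease test (safeguarded below by alpha < 1e-12)
            if f(X_new) <= f(X) + sigma * np.sum(G * (X_new - X)) or alpha < 1e-12:
                break
            alpha *= beta
        if np.linalg.norm(X_new - X) < tol:
            return X_new
        X = X_new
    return X

# Illustrative problem: maximize trace(A^T X) over St(5, 3).
# The maximizer is the polar factor of A; the optimum equals the nuclear norm of A.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))
f = lambda X: -np.trace(A.T @ X)
grad_f = lambda X: -A
X = gradient_projection(f, grad_f, rng.standard_normal((5, 3)))
```

The SVD-based projection here is exactly the projection onto the Stiefel manifold whose geometric properties (e.g. retraction-like inequalities) the paper studies; the TGP framework replaces the raw step `X - alpha * G` with transformed search directions while keeping the same projection-and-stepsize skeleton.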
In cases where the compact matrix manifold is the Stiefel or Grassmann manifold, the convergence results either encompass or surpass those found in the literature. Finally, through numerical experiments, the authors observe that the TGP algorithms, owing to their increased flexibility in choosing search directions, outperform classical gradient projection and retraction-based line-search algorithms in several scenarios.
Key insights extracted from the paper by Wentao Ding et al., arxiv.org, 05-01-2024: https://arxiv.org/pdf/2404.19392.pdf