Core Concepts
Iterative solvers for nonlinear magnetostatics achieve global convergence when interpreted as generalized gradient descent methods with adaptive stepsize control.
Abstract
The article discusses the convergence of iterative solvers for nonlinear magnetostatics.
The damped Newton method, the fixed-point iteration, and the Kačanov iteration are analyzed in a common framework.
Armijo backtracking is used for adaptive stepsize selection; the unified formulation is sketched after the abstract.
The theoretical results extend to approximation schemes such as finite elements and isogeometric analysis.
The main theorem establishes global convergence of all three iterations under suitable assumptions on the nonlinear material law.
Numerical tests are conducted to validate the theoretical findings.
The methods differ markedly in convergence rate and in cost per iteration.
The theoretical results apply to conforming Galerkin approximations as well as inexact Galerkin approximations.
The article concludes with acknowledgments and references.
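To make the unified viewpoint concrete: assuming the standard scalar-potential model of nonlinear magnetostatics (our notation and an illustrative assumption, not necessarily the article's exact setting), all three iterations can be read as preconditioned gradient descent steps for one energy functional, where ν̄ is a constant reluctivity bound playing the role of the article's ν = 7.98·10^4:

```latex
\[
  E(u) \;=\; \int_\Omega \phi(|\nabla u|)\,dx \;-\; \int_\Omega j\,u\,dx,
  \qquad \phi(s) = \int_0^s \nu(t)\,t\,dt,
\]
\[
  \langle C_k\, d^k, v\rangle = -\langle E'(u^k), v\rangle \quad \forall v,
  \qquad u^{k+1} = u^k + \tau_k\, d^k,
\]
\[
  C_k =
  \begin{cases}
    \bar\nu\,(-\Delta) & \text{fixed-point iteration},\\[2pt]
    -\operatorname{div}\!\big(\nu(|\nabla u^k|)\,\nabla\,\cdot\,\big) & \text{Ka\v{c}anov iteration},\\[2pt]
    E''(u^k) & \text{damped Newton method}.
  \end{cases}
\]
```

In this reading the three methods differ only in the preconditioner C_k, while Armijo backtracking selects the stepsize τ_k.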
Stats
The fixed-point iteration (with ν = 7.98·10^4) requires the most iterations.
The Kačanov iteration exploits additional information about the material behavior and converges faster.
The Newton method with line search achieves the smallest iteration counts; a toy comparison of all three methods follows below.
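A minimal, self-contained sketch of this comparison on a 1D toy analogue. The saturation curve ν, the load f, the mesh size, and the Armijo parameters are all illustrative assumptions rather than the article's benchmark, and the code follows the generalized gradient descent template rather than the paper's exact implementation:

```python
import numpy as np

# 1D toy analogue of -div(nu(|grad u|) grad u) = j:
#   -(nu(|u'|) u')' = f on (0,1), u(0) = u(1) = 0, finite differences.
n = 99
h = 1.0 / (n + 1)
f = np.ones(n)
nu_min, nu_max = 1.0, 100.0
dnu = nu_max - nu_min

def nu(s):           # reluctivity, increasing from nu_min to nu_max (saturation)
    return nu_max - dnu / (1.0 + s**2)

def phi(s):          # coenergy density with phi'(s) = nu(s) * s
    return 0.5 * nu_max * s**2 - 0.5 * dnu * np.log(1.0 + s**2)

def grad(u):         # u' at the n+1 cell interfaces (Dirichlet boundary)
    return np.diff(np.concatenate(([0.0], u, [0.0]))) / h

def energy(u):
    return h * np.sum(phi(grad(u))) - h * (f @ u)

def residual(u):     # exact gradient of the discrete energy
    q = nu(np.abs(grad(u))) * grad(u)
    return -np.diff(q) - h * f

def stiffness(coeff):  # tridiagonal matrix for -(coeff * u')'
    main = (coeff[:-1] + coeff[1:]) / h
    off = -coeff[1:-1] / h
    return np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

def descend(precond, tol=1e-8, max_iter=50000, c1=1e-4, beta=0.5):
    """Generalized gradient descent with Armijo backtracking."""
    u = np.zeros(n)
    for k in range(max_iter):
        r = residual(u)
        if np.linalg.norm(r) < tol:
            return k
        d = np.linalg.solve(precond(u), -r)   # preconditioned descent direction
        tau = 1.0                             # Armijo rule: shrink until
        while energy(u + tau * d) > energy(u) + c1 * tau * (r @ d):
            tau *= beta                       # sufficient decrease holds
        u = u + tau * d
    return max_iter

L = stiffness(np.ones(n + 1))  # constant-coefficient Laplacian

# Fixed-point iteration: constant preconditioner nu_max * L -- globally
# convergent, but slow: the stepsize is dictated by the worst-case slope.
it_fp = descend(lambda u: nu_max * L)

# Kacanov iteration: refreeze the coefficient at the current iterate, so
# the linear solves track the actual material behavior.
it_kac = descend(lambda u: stiffness(nu(np.abs(grad(u)))))

# Damped Newton: use the differential reluctivity nu_d(s) = (nu(s) s)'.
nu_d = lambda s: nu_max - dnu * (1.0 - s**2) / (1.0 + s**2) ** 2
it_newton = descend(lambda u: stiffness(nu_d(np.abs(grad(u)))))

print(f"fixed-point: {it_fp}, Kacanov: {it_kac}, Newton: {it_newton}")
```

Running this typically reproduces the qualitative ordering reported above: the fixed-point iteration needs by far the most steps, Kačanov far fewer, and damped Newton the fewest.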
Quotes
"The fixed-point iteration used to establish the existence of a unique solution can be viewed as a gradient descent method applied to the minimization problem."
"The Armijo rule selects a stepsize satisfying certain conditions for convergence."
"All methods exhibit global convergence with iteration numbers independent of discretization parameters."