Core Concepts
Effective limited-memory methods using compact representations for data-fitting tasks.
Abstract
This article discusses the development of new compact representations parameterized by vectors for large-scale data fitting problems. It explores their effectiveness in eigenvalue computations, tensor factorizations, and nonlinear regressions. The limited-memory approach reduces memory usage and enables efficient operations on large matrices.
Introduction to large-scale data fitting problems.
Unconstrained optimization methods, including Newton's method and gradient-based approaches.
Compact representation formulas for Hessian matrix approximations.
Implications of compact representations for eigendecomposition and matrix-updating techniques.
Numerical experiments demonstrating the scalability and efficacy of compact representations in optimization algorithms.
Comparison of eigenfactorization via thin QR factorization versus MATLAB's eig function.
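The thin-QR eigenfactorization idea above can be sketched in a few lines. A compact representation stores a large matrix as a diagonal shift plus a low-rank term, B = γI + Ψ M Ψᵀ, with Ψ tall and thin and M small and symmetric; a thin QR factorization of Ψ then reduces the eigenproblem to the small matrix R M Rᵀ, avoiding a dense O(d³) eigendecomposition. This is a minimal NumPy sketch with random stand-in matrices (the dimensions, γ, and the data are illustrative assumptions, not the article's experiments; the limited-memory parameter l = 5 gives 2l columns as in typical limited-memory compact forms):

```python
import numpy as np

rng = np.random.default_rng(0)
d, l = 200, 5            # ambient dimension and limited-memory parameter (illustrative)
k = 2 * l                # number of columns in the low-rank factor
gamma = 2.0              # diagonal shift (illustrative)

Psi = rng.standard_normal((d, k))
A = rng.standard_normal((k, k))
M = A + A.T              # small symmetric middle matrix

# Compact form: B = gamma*I + Psi @ M @ Psi.T (never needed explicitly).
# Thin QR: Psi = Q R with Q (d x k, orthonormal columns) and R (k x k).
Q, R = np.linalg.qr(Psi)

# Eigendecompose the small k x k matrix R M R^T instead of the d x d matrix B.
w_small, U = np.linalg.eigh(R @ M @ R.T)
eigvals_B = gamma + w_small   # k shifted eigenvalues of B
V = Q @ U                     # corresponding eigenvectors of B
# The remaining d - k eigenvalues of B are all equal to gamma.

# Dense reference for verification on this small instance only.
B = gamma * np.eye(d) + Psi @ M @ Psi.T
dense = np.linalg.eigvalsh(B)
```

Since B·(QU) = γ·QU + QU·diag(w_small), the columns of V are exact eigenvectors; the thin-QR route costs O(dk² + k³) rather than O(d³).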
Stats
Limited-memory parameter: l = 5
Dimensions tested: d ∈ {2^3, 2^4, ..., 2^13}
Rosenbrock function: f(w) = Σ_{i=1}^{d/2} [100(w_{2i-1}^2 - w_{2i})^2 + (w_{2i-1} - 1)^2]
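As a concrete check of the test function, here is a short NumPy evaluation of the pairwise Rosenbrock sum above (the function name and the sample dimensions are illustrative; only the formula comes from the stats):

```python
import numpy as np

def rosenbrock(w):
    """Pairwise Rosenbrock: sum of 100*(w_{2i-1}^2 - w_{2i})^2 + (w_{2i-1} - 1)^2."""
    w = np.asarray(w, dtype=float)
    odd, even = w[0::2], w[1::2]   # w_{2i-1} and w_{2i} in 1-based indexing
    return np.sum(100.0 * (odd**2 - even)**2 + (odd - 1.0)**2)

# The minimizer is w = (1, ..., 1) with f(w) = 0, e.g. at the smallest tested size:
value_at_min = rosenbrock(np.ones(2**3))
```

Each pair (w_{2i-1}, w_{2i}) contributes independently, which is what makes the problem cheap to evaluate at the large dimensions tested.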