Basic Concepts
This work proposes randomized algorithms for efficiently computing low-rank approximations of parameter-dependent matrices, which arise in application areas such as computational statistics and dynamical systems. The key idea is to use constant dimension reduction matrices (DRMs), shared across all parameter values, instead of drawing an independent DRM for each parameter value. This leads to computationally attractive methods, especially when the parameter-dependent matrix admits an affine linear decomposition.
Abstract
This work considers the problem of computing low-rank approximations of a matrix A(t) that depends on a parameter t in a compact set D ⊂ Rd. Such parameter-dependent matrices arise in various application areas, including Gaussian process regression, time-dependent data from dynamical systems, image processing, and natural language processing.
The authors propose to extend two popular randomized algorithms, the randomized singular value decomposition (HMT method) and the generalized Nyström method, to the parameter-dependent setting. The key idea is to use constant dimension reduction matrices (DRMs) that do not depend on the parameter t, in contrast to the standard approach of using independent DRMs for each parameter value.
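The idea of reusing one DRM across parameter values can be illustrated with a minimal NumPy sketch of the HMT approach (randomized rangefinder plus projection). The affine form A(t) = A0 + t·A1, the rank-3 terms, and the sketch size below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, sketch = 60, 50, 10

# Hypothetical affine parameter dependence A(t) = A0 + t*A1; each term has
# rank 3, so rank(A(t)) <= 6 for every t (A0, A1 are illustrative names)
A0 = rng.standard_normal((m, 3)) @ rng.standard_normal((3, n))
A1 = rng.standard_normal((m, 3)) @ rng.standard_normal((3, n))
A = lambda t: A0 + t * A1

# One constant Gaussian DRM, reused for every parameter value t
Omega = rng.standard_normal((n, sketch))

def hmt(At):
    """HMT rangefinder: orthonormalize the sketch, then project onto its range."""
    Q, _ = np.linalg.qr(At @ Omega)  # orthonormal basis for the sketched range
    return Q @ (Q.T @ At)            # rank <= sketch approximation Q Q^T A(t)

errors = [np.linalg.norm(A(t) - hmt(A(t))) / np.linalg.norm(A(t))
          for t in np.linspace(0.0, 1.0, 5)]
# rank(A(t)) <= 6 < sketch size, so each relative error is near machine precision
```

Note that the same `Omega` is used for all five parameter values; in the standard approach, a fresh Gaussian matrix would be drawn inside the loop for each t.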
The use of constant DRMs leads to significant computational savings, especially when A(t) admits an affine linear decomposition with respect to t, because the sketches of the decomposition's fixed terms can be precomputed once and reused for every parameter value. The authors provide a probabilistic analysis of both algorithms, deriving bounds on the expected approximation error as well as failure probabilities when Gaussian random DRMs are used. The theoretical results and numerical experiments show that using constant DRMs does not impair the effectiveness of the methods: they reliably return quasi-best low-rank approximations.
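The savings from an affine decomposition can be sketched for the generalized Nyström method, which approximates A ≈ (AΩ)(Ψᵀ A Ω)⁺(Ψᵀ A) with a right DRM Ω and an oversampled left DRM Ψ. In this minimal example (the affine form, ranks, and sketch sizes are illustrative assumptions), the sketches of the fixed terms A0 and A1 are computed once, so each new t costs only small linear combinations and small dense factorizations:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, r, p = 60, 50, 8, 4

# Illustrative affine decomposition A(t) = A0 + t*A1 with rank(A(t)) <= 6
A0 = rng.standard_normal((m, 3)) @ rng.standard_normal((3, n))
A1 = rng.standard_normal((m, 3)) @ rng.standard_normal((3, n))

Omega = rng.standard_normal((n, r))    # right DRM, constant in t
Psi = rng.standard_normal((m, r + p))  # oversampled left DRM, constant in t

# Sketch each affine term once; no large matrix products are needed per t
A0O, A1O = A0 @ Omega, A1 @ Omega
PA0, PA1 = Psi.T @ A0, Psi.T @ A1

def nystrom(t):
    """Generalized Nystrom: A(t) ~ A(t)Omega (Psi^T A(t) Omega)^+ Psi^T A(t)."""
    AO = A0O + t * A1O  # A(t) @ Omega, assembled from precomputed sketches
    PA = PA0 + t * PA1  # Psi.T @ A(t), likewise
    return AO @ np.linalg.pinv(PA @ Omega, rcond=1e-10) @ PA

errs = [np.linalg.norm((A0 + t * A1) - nystrom(t)) / np.linalg.norm(A0 + t * A1)
        for t in np.linspace(0.0, 1.0, 5)]
```

The oversampling p in the left sketch stabilizes the pseudoinverse of the small core matrix Ψᵀ A(t) Ω, which is the standard remedy for the method's sensitivity to ill-conditioning.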