Key Concepts
Continuous normalizing flows (CNFs) are a generative method for learning probability distributions from finite random samples. This work establishes non-asymptotic error bounds for the distribution estimator based on CNFs with linear interpolation and flow matching, under assumptions on the target distribution.
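For concreteness, a standard formulation of a CNF with linear interpolation and the flow matching least-squares objective is sketched below; the notation, time convention, and function class are assumptions for illustration, and the paper's exact setup (e.g. direction of interpolation or an early-stopping time) may differ.

```latex
% Linear interpolation between a Gaussian source X_0 and a target sample X_1
X_t = (1 - t)\, X_0 + t\, X_1, \qquad X_0 \sim N(0, I_d),\quad X_1 \sim \mu_{\mathrm{target}},\quad t \in [0, 1].

% The velocity field is the conditional expectation of the interpolant's slope
v(x, t) = \mathbb{E}\bigl[\, X_1 - X_0 \mid X_t = x \,\bigr].

% Flow matching estimates v by least squares over a function class \mathcal{F}
\widehat{v} \in \arg\min_{u \in \mathcal{F}}\ \mathbb{E}_{t,\, X_0,\, X_1} \bigl\| u(X_t, t) - (X_1 - X_0) \bigr\|^2.

% The CNF transports the source toward (an estimate of) the target via the ODE
\frac{\mathrm{d}}{\mathrm{d}t} Z_t = \widehat{v}(Z_t, t), \qquad Z_0 \sim N(0, I_d).
```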
Summary
The paper presents a theoretical analysis of continuous normalizing flows (CNFs) for learning probability distributions from finite random samples.
Key highlights:
- CNFs use ordinary differential equations to construct a stochastic process that transports a simple source distribution (e.g., Gaussian) to the target distribution.
- The velocity field of the CNF can be estimated using a flow matching method, which solves a least squares problem (see the sketch after this list).
- Three main sources of error are identified: discretization error, error due to velocity estimation, and early stopping error.
- Regularity properties of the velocity field are derived, showing Lipschitz continuity in both time and space variables.
- Non-asymptotic error bounds are established for the distribution estimator based on CNFs with linear interpolation, under assumptions that the target distribution has bounded support, is strongly log-concave, or is a mixture of Gaussians.
- The nonparametric convergence rate of the distribution estimator is shown to be Õ(n^(-1/(d+5))), up to logarithmic factors, where n is the sample size and d is the data dimension.
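The following is a minimal, self-contained sketch of this pipeline on a toy 2-D Gaussian mixture: linear interpolation between Gaussian source and target samples, a flow matching fit posed as a literal least squares problem (using a small polynomial feature map as a stand-in for the richer velocity classes analyzed in the paper), and sampling by Euler discretization of the ODE with early stopping. All variable names and parameter values here are illustrative assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Toy target: a 2-D mixture of Gaussians (one of the settings covered by the bounds) ---
d, n = 2, 4000
centers = np.array([[-2.0, 0.0], [2.0, 0.0]])
x1 = centers[rng.integers(2, size=n)] + 0.3 * rng.standard_normal((n, d))  # target samples
x0 = rng.standard_normal((n, d))                                           # Gaussian source samples
t = rng.uniform(0.0, 1.0, size=(n, 1))                                     # interpolation times

# --- Linear interpolation and the flow matching regression target ---
xt = (1.0 - t) * x0 + t * x1          # linearly interpolated points
target = x1 - x0                      # slope of the interpolation path

# --- Velocity estimation as a least squares problem ---
# With a fixed polynomial feature map, the flow matching step reduces to a
# linear least squares solve; the paper's analysis allows richer estimators.
def features(x, t):
    return np.hstack([np.ones((len(x), 1)), x, t, t * x, t**2, x**2])

A = features(xt, t)
W, *_ = np.linalg.lstsq(A, target, rcond=None)   # least squares velocity fit

def v_hat(x, t_scalar):
    tt = np.full((len(x), 1), t_scalar)
    return features(x, tt) @ W

# --- Sampling: Euler discretization of the ODE with early stopping ---
# The step size h and the early-stopping gap delta correspond to the
# discretization and early stopping terms in the error decomposition.
n_steps, delta = 100, 1e-2
h = (1.0 - delta) / n_steps
z = rng.standard_normal((2000, d))               # start from the Gaussian source
for k in range(n_steps):
    z = z + h * v_hat(z, k * h)                  # forward Euler step

print("generated sample mean:", z.mean(axis=0))
print("generated sample std: ", z.std(axis=0))
```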
Statistics
The target distribution is assumed to satisfy one of the following conditions: it has bounded support, is strongly log-concave, or is a mixture of Gaussians.
The sample size is denoted as n.
The data dimension is denoted as d.