Core Concepts
This research paper presents a theoretical framework that extends diffusion-based generative models from finite-dimensional spaces to infinite-dimensional function spaces using stochastic optimal control (SOC). The framework is applied to tasks such as resolution-free image translation and Bayesian posterior sampling for stochastic processes.
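One intuition behind resolution-free generation is that a diffusion bridge defined on a function space can be simulated on a discretization grid of any size. Below is a minimal sketch of a pinned (Brownian-bridge) diffusion on function values; the function names, `sigma` parameter, and Euler-Maruyama discretization are illustrative assumptions, not the paper's actual SOC-derived sampler.

```python
import numpy as np

def brownian_bridge_path(f0, f1, n_steps=100, sigma=1.0, rng=None):
    """Simulate a Brownian bridge from function values f0 to f1.

    Illustrative only: SOC-based bridges in function space reduce,
    on any discretization grid, to pinned diffusions like this one.
    f0, f1: arrays of function values on a shared grid (any resolution).
    """
    rng = np.random.default_rng(rng)
    d = f0.shape[0]
    dt = 1.0 / n_steps
    x = f0.copy()
    path = [x.copy()]
    for k in range(n_steps):
        t = k * dt
        # Bridge drift steers the state toward the endpoint f1.
        drift = (f1 - x) / (1.0 - t)
        x = x + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal(d)
        path.append(x.copy())
    path[-1] = f1.copy()  # pin the endpoint exactly
    return np.stack(path)
```

Because nothing in the update depends on a fixed grid size, the same sampler runs unchanged on a 16-point or a 1024-point discretization of the function.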
Stats
The authors demonstrate their bridge matching algorithm on a toy task of bridging probability densities, showing the progression from a ring-shaped density to a Gaussian mixture.
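To make this toy setup concrete, the sketch below samples the two densities and draws interpolants from the Brownian-bridge marginal between paired endpoints, which is the kind of sample a bridge-matching regression target is evaluated at. The samplers, mixture parameters, and helper names are assumptions for illustration, not the paper's exact setup.

```python
import numpy as np

def sample_ring(n, radius=2.0, noise=0.1, rng=None):
    """Samples from a ring-shaped 2D density (illustrative toy data)."""
    rng = np.random.default_rng(rng)
    theta = rng.uniform(0, 2 * np.pi, n)
    r = radius + noise * rng.standard_normal(n)
    return np.stack([r * np.cos(theta), r * np.sin(theta)], axis=1)

def sample_mixture(n, rng=None):
    """Samples from a 2-component Gaussian mixture (illustrative toy data)."""
    rng = np.random.default_rng(rng)
    means = np.array([[-2.0, 0.0], [2.0, 0.0]])
    comp = rng.integers(0, 2, n)
    return means[comp] + 0.5 * rng.standard_normal((n, 2))

def bridge_marginal_sample(x0, x1, t, rng=None):
    """Sample x_t from the Brownian-bridge marginal between paired endpoints.

    In bridge matching, a network is regressed onto the conditional drift
    (x1 - x_t) / (1 - t) evaluated at samples like these.
    """
    rng = np.random.default_rng(rng)
    mean = (1.0 - t) * x0 + t * x1
    std = np.sqrt(t * (1.0 - t))
    return mean + std * rng.standard_normal(x0.shape)
```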
In 1D function generation experiments, the method performs comparably to baseline infinite-dimensional methods on the Quadratic, Melbourne, and Gridwatch datasets, as measured by the power of a kernel two-sample hypothesis test.
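The power of a kernel two-sample test is the rate at which the test rejects the null hypothesis that generated and real functions come from the same distribution. A common statistic for such tests is the maximum mean discrepancy (MMD); the sketch below, a minimal version assuming an RBF kernel and a permutation test (the paper's exact kernel and test procedure may differ), computes the statistic and a p-value.

```python
import numpy as np

def mmd2_unbiased(x, y, bandwidth=1.0):
    """Unbiased squared MMD with an RBF kernel (one common choice)."""
    def k(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * bandwidth ** 2))
    kxx, kyy, kxy = k(x, x), k(y, y), k(x, y)
    n, m = len(x), len(y)
    np.fill_diagonal(kxx, 0.0)  # drop i == j terms for unbiasedness
    np.fill_diagonal(kyy, 0.0)
    return kxx.sum() / (n * (n - 1)) + kyy.sum() / (m * (m - 1)) - 2 * kxy.mean()

def permutation_pvalue(x, y, n_perm=100, rng=None):
    """p-value by permuting the pooled samples. Test power is then the
    rejection rate of this test over repeated draws of generated vs.
    real function samples."""
    rng = np.random.default_rng(rng)
    observed = mmd2_unbiased(x, y)
    pooled = np.concatenate([x, y])
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(len(pooled))
        px, py = pooled[perm[:len(x)]], pooled[perm[len(x):]]
        count += mmd2_unbiased(px, py) >= observed
    return (count + 1) / (n_perm + 1)
```

Lower power against real data indicates the generated functions are harder to distinguish from the ground truth, which is the sense in which the baselines are matched.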
For unpaired image transfer, the proposed DBFS model achieves FID scores comparable to finite-dimensional baselines on tasks such as EMNIST to MNIST and AFHQ-64 Wild to Cat, while also generating images at resolutions unseen during training.
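For reference, FID is the Fréchet distance between two Gaussians fitted to feature activations of generated and real images (in practice, Inception features). A minimal NumPy sketch of the standard formula, using an eigendecomposition in place of a general matrix square root:

```python
import numpy as np

def fid(mu1, sigma1, mu2, sigma2):
    """Fréchet Inception Distance between two Gaussians (mu, sigma)
    fitted to feature activations of real and generated images."""
    diff = mu1 - mu2
    # trace of (sigma1 @ sigma2)^{1/2} via the symmetric form
    # sigma2^{1/2} @ sigma1 @ sigma2^{1/2}, avoiding a general sqrtm
    w2, v2 = np.linalg.eigh(sigma2)
    s2_half = (v2 * np.sqrt(np.clip(w2, 0, None))) @ v2.T
    eigs = np.linalg.eigvalsh(s2_half @ sigma1 @ s2_half)
    tr_covmean = np.sqrt(np.clip(eigs, 0, None)).sum()
    return float(diff @ diff + np.trace(sigma1 + sigma2) - 2.0 * tr_covmean)
```

Lower is better; identical feature statistics give an FID of zero.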
In Bayesian learning experiments, DBFS outperforms previous diffusion-based imputation methods such as CSDI and DSDP-GP on the Physionet medical time-series dataset, achieving lower RMSE across varying degrees of missingness.
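In imputation benchmarks of this kind, a fraction of observed entries is artificially held out and RMSE is computed only over those held-out entries. A small sketch of that evaluation loop (the helper names and masking convention are assumptions, not the benchmark's actual code):

```python
import numpy as np

def imputation_rmse(truth, imputed, observed_mask):
    """RMSE over the held-out (missing) entries only, as is standard
    in time-series imputation benchmarks."""
    target = ~observed_mask  # evaluate only where values were held out
    err = (truth - imputed)[target]
    return float(np.sqrt(np.mean(err ** 2)))

def random_missingness(x, frac_missing, rng=None):
    """Drop a fraction of entries at random to create evaluation targets.

    Returns a boolean mask: True = still observed, False = held out.
    """
    rng = np.random.default_rng(rng)
    return rng.random(x.shape) >= frac_missing
```

Sweeping `frac_missing` (e.g. 10% to 90%) reproduces the "various degrees of missingness" axis of such comparisons.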
The method also shows promising results on functional regression, achieving competitive log-likelihoods against CNP and NP models on synthetic data drawn from Gaussian processes with different covariance kernels.
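Since the synthetic data is drawn from Gaussian processes, the ground-truth density that these log-likelihoods are compared against is the GP log marginal likelihood. A minimal sketch, assuming a zero-mean GP with an RBF kernel and a small observation-noise term (the paper evaluates several covariance kernels):

```python
import numpy as np

def rbf(a, b, lengthscale=1.0):
    """Squared-exponential covariance (one kernel choice among several)."""
    return np.exp(-0.5 * ((a - b) / lengthscale) ** 2)

def gp_log_likelihood(x, y, kernel, noise=1e-2):
    """Log marginal likelihood of observations y at inputs x under a
    zero-mean GP, computed stably via a Cholesky factorization."""
    n = len(x)
    K = kernel(x[:, None], x[None, :]) + noise * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return float(-0.5 * y @ alpha
                 - np.log(np.diag(L)).sum()
                 - 0.5 * n * np.log(2 * np.pi))
```

Higher log-likelihood on held-out function evaluations means the model's predictive distribution is closer to the generating GP.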