Generative adversarial networks (GANs) can be optimized so that the generator distribution approaches the target distribution, provided the discriminator satisfies metrizable conditions: direction optimality, separability, and injectivity.
The authors propose improved techniques, from both the evaluation and training perspectives, that allow likelihood estimation with diffusion ODEs to outperform existing state-of-the-art likelihood estimators.
Iterative retraining of generative models on a mix of real and synthetic data can be stable, provided the initial generative model is sufficiently well-trained and the proportion of real data is large enough.
Introduces Semi-dual JKO, a new scalable generative model based on Wasserstein gradient flows (WGF).
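This is not the paper's Semi-dual JKO scheme, but a minimal sketch of the Wasserstein-gradient-flow idea it scales up: the WGF of the KL divergence toward a target density, discretized over particles as Langevin dynamics (the continuous-time limit of the JKO scheme). All settings here (target N(0, 1), step size, particle count) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Wasserstein gradient flow of F(rho) = KL(rho || N(0, 1)),
# simulated as Langevin dynamics over particles. Illustrative
# sketch only; not the Semi-dual JKO algorithm itself.
particles = rng.normal(loc=5.0, scale=0.5, size=2000)  # initial rho_0
tau = 0.05  # time step (plays the role of the JKO step size)

for _ in range(500):
    drift = -particles  # gradient of log N(0,1) density at each particle
    noise = rng.normal(size=particles.shape)
    particles = particles + tau * drift + np.sqrt(2 * tau) * noise

# The particle cloud drifts from its start near 5.0 toward the target N(0, 1).
print(round(particles.mean(), 2), round(particles.std(), 2))
```

Each step moves particles downhill on the potential while injecting noise, so the empirical distribution follows the gradient flow of KL in Wasserstein space; the mean and standard deviation end up near 0 and 1.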
Optimal Transport theory enhances stability and performance in generative modeling.
Introducing a simplified approach to Diffusion Schrödinger Bridge for improved generative modeling.
OT-based GANs benefit from strictly convex functions and well-chosen cost functions, which enhance training stability and prevent mode collapse.
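As a concrete illustration of the kind of OT cost these objectives build on, here is a minimal sketch of the entropy-regularized OT cost between two point clouds, computed with plain Sinkhorn iterations. The sample sizes, cost (squared Euclidean), and regularization strength are illustrative assumptions, not the setup of any specific paper above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two empirical distributions standing in for "real" and "generated" samples.
x = rng.normal(0.0, 1.0, size=(200, 2))
y = rng.normal(0.5, 1.0, size=(200, 2))

C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)  # squared-Euclidean cost
eps = 0.5                                           # entropic regularization
K = np.exp(-C / eps)                                # Gibbs kernel
a = np.ones(200) / 200                              # uniform source weights
b = np.ones(200) / 200                              # uniform target weights
u = np.ones(200)
v = np.ones(200)

# Sinkhorn fixed-point iterations: alternately match the two marginals.
for _ in range(500):
    u = a / (K @ v)
    v = b / (K.T @ u)

P = u[:, None] * K * v[None, :]  # entropic transport plan
ot_cost = (P * C).sum()          # transport cost under the plan
print(round(ot_cost, 3))
```

The resulting plan `P` couples the two sample sets while (approximately) respecting both marginals; OT-based generative objectives differentiate quantities like this cost with respect to the generated samples.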