Exponentially Fast Distillation of Pretrained Diffusion Models for One-Step Generation
The proposed Score identity Distillation (SiD) method distills the generative capabilities of pretrained diffusion models into a single-step generator, achieving an exponentially fast reduction in Fréchet inception distance (FID) during distillation and ultimately surpassing the FID performance of the original teacher diffusion models.
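To make the setting concrete, the sketch below illustrates the general shape of score-based distillation into a one-step generator: a frozen pretrained teacher score network, a trainable single-step generator, and an auxiliary score network fit to the generator's own outputs, alternately updated. This is a minimal, hypothetical illustration only; the network architectures, noise schedule, and the squared-difference surrogate loss are placeholder assumptions and are not the actual SiD objective, which is built from score identities.

```python
# Illustrative sketch of score-based distillation into a one-step generator.
# All names, architectures, and losses here are hypothetical placeholders,
# not the SiD method or its released code.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """One-step generator: maps noise z to a sample in a single forward pass."""
    def __init__(self, dim=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 256), nn.SiLU(), nn.Linear(256, dim))
    def forward(self, z):
        return self.net(z)

class ScoreNet(nn.Module):
    """Stand-in for a noise-conditioned score/denoiser network."""
    def __init__(self, dim=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim + 1, 256), nn.SiLU(), nn.Linear(256, dim))
    def forward(self, x_t, sigma):
        return self.net(torch.cat([x_t, sigma.expand(x_t.size(0), 1)], dim=1))

dim, batch = 64, 32
generator = Generator(dim)                            # trainable one-step student
teacher_score = ScoreNet(dim).requires_grad_(False)   # frozen pretrained teacher (placeholder)
fake_score = ScoreNet(dim)                            # score network for the generator's distribution

opt_g = torch.optim.Adam(generator.parameters(), lr=1e-4)
opt_f = torch.optim.Adam(fake_score.parameters(), lr=1e-4)

for step in range(3):                                 # a few illustrative iterations
    z = torch.randn(batch, dim)
    sigma = torch.rand(1) * 0.9 + 0.1                 # hypothetical noise-level sampling

    # (1) Fit the fake score network to noised generator samples
    #     (denoising-style regression; placeholder objective).
    x_g = generator(z).detach()
    x_t = x_g + sigma * torch.randn_like(x_g)
    loss_f = ((fake_score(x_t, sigma) - x_g) ** 2).mean()
    opt_f.zero_grad()
    loss_f.backward()
    opt_f.step()

    # (2) Update the generator using the gap between the teacher's score and the
    #     fake score on its own noised samples. SiD derives its generator loss
    #     from score identities; this surrogate only conveys the overall structure.
    x_g = generator(z)
    x_t = x_g + sigma * torch.randn_like(x_g)
    gap = teacher_score(x_t, sigma) - fake_score(x_t, sigma).detach()
    loss_g = (gap.detach() * x_g).mean()              # pushes samples along the score gap (illustrative)
    opt_g.zero_grad()
    loss_g.backward()
    opt_g.step()

# After distillation, sampling is a single forward pass with no iterative denoising.
with torch.no_grad():
    samples = generator(torch.randn(16, dim))
```

The key contrast with the teacher is at inference time: the distilled generator produces a sample in one network evaluation, whereas the teacher diffusion model requires many iterative denoising steps.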