The paper presents a framework for applying training-free guidance to discrete diffusion models, which allows for flexible guidance of the data generation process without the need for additional training. This is demonstrated on molecular graph generation tasks using the discrete diffusion model architecture of DiGress.
The key highlights are:
Training-free guidance methods for continuous data have seen significant interest, as they enable foundation diffusion models to be paired with interchangeable guidance models. However, equivalent guidance methods for discrete diffusion models were previously unknown.
The authors introduce a framework for applying training-free guidance to discrete data, which involves modeling the gradient of the log probability of the target attribute with respect to the noised latent variable at each timestep (a minimal code sketch follows these highlights).
The authors demonstrate the effectiveness of their approach on molecular graph generation tasks, guiding the generation process to produce molecules with a specified percentage of a given atom type and a target heavy-atom molecular weight.
The results show that as the guidance strength (λ) is increased, the generated molecules better match the target attributes while maintaining a high percentage of valid molecules.
The authors discuss the limitations of their approach, which relies on the discrete diffusion model accurately learning the underlying data distribution, and suggest future work exploring assumptions analogous to those made in the continuous setting.
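To make the guidance mechanism concrete, here is a minimal sketch of one guided reverse-sampling step in PyTorch. The names `denoiser` (a discrete diffusion model returning categorical logits), `predictor` (any differentiable model estimating the log probability of the target attribute from a one-hot encoding of the noised sample), and `lam` (the guidance strength λ) are hypothetical stand-ins, not the paper's API; the gradient-tilted categorical sampling shown is a generic first-order rendering of the idea described above, not necessarily the authors' exact formulation.

```python
# Hypothetical sketch: one training-free guided reverse step for a discrete
# diffusion model. All callables and signatures here are assumptions made
# for illustration, not the paper's implementation.
import torch
import torch.nn.functional as F

def guided_reverse_step(denoiser, predictor, x_t, t, num_classes, lam=1.0):
    """Tilt the denoiser's categorical posterior by the gradient of the
    predictor's log-probability of the target attribute, scaled by lam."""
    # One-hot encode the current noised sample and track gradients through it.
    x_onehot = F.one_hot(x_t, num_classes).float().requires_grad_(True)

    # Gradient of log p(target attribute | x_t) w.r.t. the relaxed one-hot input.
    log_p_attr = predictor(x_onehot, t).sum()
    grad = torch.autograd.grad(log_p_attr, x_onehot)[0]

    # Unguided categorical logits for the next (less noisy) state.
    with torch.no_grad():
        logits = denoiser(x_t, t)  # shape: (batch, nodes, num_classes)

    # Shift the logits by the guidance gradient; lam is the guidance strength.
    guided_logits = logits + lam * grad

    # Sample the next discrete state from the tilted distribution.
    probs = F.softmax(guided_logits, dim=-1)
    return torch.distributions.Categorical(probs=probs).sample()
```

In this sketch, larger values of lam push each categorical posterior harder toward states the predictor scores highly, which mirrors the trade-off reported in the paper: stronger guidance improves attribute matching while validity must still be monitored.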
Source: Thomas J. Ke... at arxiv.org, 09-12-2024, https://arxiv.org/pdf/2409.07359.pdf