Core Concepts
Adapting large-scale diffusion models for style transfer without optimization.
Abstract
Abstract: Introduces a novel artistic style transfer method based on a pre-trained large-scale diffusion model without optimization.
Introduction: Discusses recent advances in diffusion models and their applications in generative tasks.
Method: Describes the approach of manipulating self-attention features for style transfer, along with query preservation and attention temperature scaling.
Experiments: Details the experimental settings, evaluation protocol, quantitative comparisons with conventional and diffusion-based methods, qualitative comparisons, ablation studies, and additional analysis.
Conclusion: Highlights the proposed method's superiority over state-of-the-art baselines.
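The self-attention manipulation summarized above (substituting style keys/values, preserving the content query, and scaling the attention temperature) can be sketched as follows. This is a minimal single-head NumPy illustration, not the paper's implementation: the parameter names `gamma` and `tau`, their default values, and the assumption that content and style features share the same token count are all assumptions for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def style_injected_attention(q_c, q_s, k_s, v_s, gamma=0.75, tau=1.5):
    """Single-head self-attention with style injection (illustrative sketch).

    q_c          : content queries, shape (n, d)
    q_s, k_s, v_s: style queries/keys/values, shape (n, d)
    gamma        : blends content and style queries (query preservation)
    tau          : > 1 sharpens the attention map (temperature scaling)
    """
    d = q_c.shape[-1]
    q = gamma * q_c + (1.0 - gamma) * q_s      # query preservation
    logits = tau * (q @ k_s.T) / np.sqrt(d)    # temperature-scaled logits
    attn = softmax(logits, axis=-1)            # attention over style tokens
    return attn @ v_s                          # aggregate style values

# Usage: inject style features into a content pass
rng = np.random.default_rng(0)
n, d = 4, 8
q_c, q_s, k_s, v_s = (rng.normal(size=(n, d)) for _ in range(4))
out = style_injected_attention(q_c, q_s, k_s, v_s)
print(out.shape)  # (4, 8)
```

With `gamma = 1.0` the query is the unmodified content query, so the content structure fully drives where style values are gathered from; lowering `gamma` trades structural fidelity for stylization strength.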
Stats
"Extensive experiments on the style transfer dataset validate the proposed method significantly outperforms previous methods and achieves state-of-the-art performance."
"Our method requires a total of 12.4 seconds for inference."
Quotes
"We propose a style transfer method exploiting the large-scale pre-trained DM by simple manipulation of the features in self-attention."
"Our main contributions are summarized as follows:"