Automatic Colorization with Imagination: Generating Diverse and Photorealistic Results
Our framework leverages pre-trained diffusion models to synthesize colorful reference images that are semantically similar, structurally aligned, and instance-aware with respect to the grayscale input. These references then guide the colorization process, enabling diverse, controllable, and photorealistic results. A minimal sketch of this imagine-then-colorize pipeline follows.
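The sketch below is illustrative only, not the framework's implementation: it uses a ControlNet with Canny-edge conditioning (`lllyasviel/sd-controlnet-canny` on `runwayml/stable-diffusion-v1-5`) as a stand-in for the paper's structure-aligned reference synthesis, and a naive Lab-space chrominance transfer as a stand-in for its reference-guided colorization module. The file paths and text prompt are hypothetical.

```python
# Imagine-then-colorize sketch (assumptions noted above): a pre-trained diffusion
# model "imagines" a colorful reference that follows the grayscale input's
# structure, then the reference's chrominance guides the output colors.
import cv2
import numpy as np
import torch
from PIL import Image
from skimage import color
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline


def imagine_reference(gray: Image.Image, prompt: str) -> Image.Image:
    """Synthesize a structurally aligned colorful reference with a diffusion model."""
    edges = cv2.Canny(np.array(gray.convert("L")), 100, 200)        # structural hint (assumed conditioning)
    control = Image.fromarray(np.stack([edges] * 3, axis=-1))       # 3-channel control map
    controlnet = ControlNetModel.from_pretrained(
        "lllyasviel/sd-controlnet-canny", torch_dtype=torch.float16
    )
    pipe = StableDiffusionControlNetPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", controlnet=controlnet, torch_dtype=torch.float16
    ).to("cuda")                                                     # assumes a GPU is available
    return pipe(prompt, image=control, num_inference_steps=30).images[0]


def transfer_chrominance(gray: Image.Image, reference: Image.Image) -> Image.Image:
    """Keep the input's luminance, borrow the reference's a/b chrominance (naive guidance)."""
    ref = reference.resize(gray.size)
    gray_lab = color.rgb2lab(np.array(gray.convert("RGB")) / 255.0)
    ref_lab = color.rgb2lab(np.array(ref) / 255.0)
    out_lab = np.dstack([gray_lab[..., 0], ref_lab[..., 1], ref_lab[..., 2]])
    out_rgb = np.clip(color.lab2rgb(out_lab), 0.0, 1.0)
    return Image.fromarray((out_rgb * 255).astype(np.uint8))


if __name__ == "__main__":
    gray_input = Image.open("input_gray.png")                        # hypothetical path
    ref = imagine_reference(gray_input, "a sunlit street scene")     # hypothetical prompt; vary it for diverse results
    transfer_chrominance(gray_input, ref).save("colorized.png")
```

Varying the prompt or the diffusion seed yields different references and hence diverse colorizations, which is the intuition behind the diversity and controllability claims.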