The paper proposes a method for low-light image enhancement that shifts the focus from deterministic pixel-wise comparison to a statistical perspective. The key idea is to introduce spatial entropy into the loss function so that it measures the difference between the intensity distributions of the predicted and ground-truth images.
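As a rough illustration of this statistical view (a generic formulation, not necessarily the paper's exact definition), spatial entropy can be written as the Shannon entropy of intensity probabilities estimated over local neighborhoods, and the loss then compares this statistic between prediction and ground truth. Here $K_h$ is a smoothing kernel with bandwidth $h$, $N(i)$ is the neighborhood of pixel $i$, and $\mathcal{L}_{\mathrm{ent}}$ is an assumed name for the resulting objective:

```latex
% Sketch of a spatial-entropy-style objective (illustrative only).
\begin{align}
  p_x(v \mid i) &= \frac{1}{|N(i)|} \sum_{j \in N(i)} K_h\!\left(v - x_j\right), \\
  H(x) &= -\frac{1}{|\Omega|} \sum_{i \in \Omega} \sum_{v} p_x(v \mid i) \log p_x(v \mid i), \\
  \mathcal{L}_{\mathrm{ent}} &= \bigl| H(\hat{x}) - H(x_{\mathrm{gt}}) \bigr|.
\end{align}
```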
To make the spatial entropy differentiable, the authors employ kernel density estimation (KDE) to approximate the probability of each intensity value within a pixel's local neighborhood. They then equip diffusion models with this entropy-based loss in place of the traditional ℓ1-based noise-matching loss, aiming for higher accuracy and better perceptual quality.
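The sketch below shows the core mechanism that makes such a loss trainable: a Gaussian-kernel soft histogram (a simple form of KDE) turns intensity probabilities into a differentiable function of the image, so an entropy- and distribution-matching term can back-propagate to the network. It is a minimal, hypothetical implementation; the class name, kernel choice, bandwidth, and the way the entropy and histogram terms are combined are assumptions, not taken from the authors' code, and this version uses a global histogram rather than per-pixel spatial neighborhoods.

```python
import torch
import torch.nn as nn


class SoftEntropyLoss(nn.Module):
    """Compare KDE-estimated intensity distributions of two images.

    A Gaussian kernel evaluated at fixed bin centers replaces the hard
    histogram with a differentiable soft histogram, so the entropy and
    the distance between the two distributions can back-propagate to
    the predicted image.
    """

    def __init__(self, num_bins: int = 256, bandwidth: float = 0.01):
        super().__init__()
        # Bin centers over the normalized intensity range [0, 1].
        self.register_buffer("centers", torch.linspace(0.0, 1.0, num_bins))
        self.bandwidth = bandwidth

    def soft_hist(self, img: torch.Tensor) -> torch.Tensor:
        # img: (B, C, H, W) with values in [0, 1]; flatten spatial dims.
        x = img.flatten(start_dim=1).unsqueeze(-1)             # (B, N, 1)
        c = self.centers.view(1, 1, -1)                        # (1, 1, K)
        # Gaussian kernel response of every pixel to every bin center.
        k = torch.exp(-0.5 * ((x - c) / self.bandwidth) ** 2)  # (B, N, K)
        hist = k.sum(dim=1)                                    # (B, K)
        return hist / (hist.sum(dim=1, keepdim=True) + 1e-8)   # probabilities

    def forward(self, pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        p = self.soft_hist(pred)
        q = self.soft_hist(target)
        # Entropy difference plus an L1 match of the soft histograms;
        # the exact combination used in the paper may differ.
        h_p = -(p * (p + 1e-8).log()).sum(dim=1)
        h_q = -(q * (q + 1e-8).log()).sum(dim=1)
        return (h_p - h_q).abs().mean() + (p - q).abs().sum(dim=1).mean()


# Usage: a drop-in alternative to an L1 noise-matching objective.
loss_fn = SoftEntropyLoss()
pred = torch.rand(2, 3, 64, 64, requires_grad=True)
gt = torch.rand(2, 3, 64, 64)
loss = loss_fn(pred, gt)
loss.backward()
```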
The method is evaluated on two low-light image datasets (LoL-v1 and LoL-v2-real) and on the NTIRE 2024 low-light enhancement challenge. The results demonstrate that the statistics-based entropy loss improves the perceptual quality of diffusion-based image restoration, as measured by LPIPS and FID, while maintaining competitive distortion performance (PSNR, SSIM).
The authors also conduct ablation studies to analyze the contribution of the entropy loss, showing that it outperforms the traditional ℓ1 loss on both perceptual and distortion metrics when applied to the Refusion diffusion model.
Source: Wenyi Lian et al., arxiv.org, 2024-04-16, https://arxiv.org/pdf/2404.09735.pdf