Core Concepts
EdgeRelight360 enables real-time video portrait relighting on mobile devices by leveraging text-conditioned generation of 360-degree high dynamic range image (HDRI) maps.
Abstract
The paper presents EdgeRelight360, an approach for real-time video portrait relighting on mobile devices. The key components are:
- Text-Conditioned 360-Degree HDRI Map Generation:
  - The authors leverage the generative capabilities of Stable Diffusion to produce 360-degree HDRI maps by training it on 8-bit quantized HDRI maps that follow the HDR10 standard.
  - This allows diverse and realistic environment maps to be generated from text prompts.
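The 8-bit HDR10-style quantization mentioned above can be sketched with the standard PQ transfer function (SMPTE ST 2084), which HDR10 uses to map linear radiance to a perceptually uniform code. The constants and functions below are the published PQ parameters and an illustrative 8-bit rounding step, not code from the paper:

```python
# Standard PQ (SMPTE ST 2084) constants, as used by HDR10.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_encode_8bit(nits: float) -> int:
    """Map linear luminance (cd/m^2, up to 10000) to an 8-bit PQ code."""
    y = max(nits, 0.0) / 10000.0                 # normalize to [0, 1]
    yp = y ** m1
    e = ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2   # PQ-encoded value in [0, 1]
    return round(e * 255)                        # illustrative 8-bit quantization

def pq_decode(code: int) -> float:
    """Invert an 8-bit PQ code back to approximate linear luminance."""
    e = code / 255
    ep = e ** (1 / m2)
    y = (max(ep - c1, 0.0) / (c2 - c3 * ep)) ** (1 / m1)
    return y * 10000.0
```

Encoding HDRI maps this way compresses the wide radiance range into 8 bits on a perceptual scale, which is what lets a standard 8-bit image generator such as Stable Diffusion be trained on HDR content.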
- Lightweight Video Relighting Framework:
  - The authors propose a lightweight video relighting pipeline that combines a normal estimation network with a light-adding rendering approach.
  - This enables realistic, fast, and temporally consistent relighting of in-the-wild portrait videos.
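A "light adding" render step of this kind can be illustrated with a simple Lambertian model: per-pixel normals (as produced by a normal estimation network) are dotted with directional lights sampled from the environment map, and each light's contribution is accumulated over the portrait's albedo. The function below is a hedged, per-pixel sketch of that idea, not the paper's implementation:

```python
# Illustrative Lambertian "light adding" step for one pixel.
# albedo: (r, g, b) in [0, 1]; normal: unit (x, y, z);
# lights: list of (unit direction toward the light, rgb intensity).
def relight_pixel(albedo, normal, lights):
    out = [0.0, 0.0, 0.0]
    for direction, color in lights:
        # Lambertian term: back-facing lights contribute nothing.
        ndotl = max(sum(n * d for n, d in zip(normal, direction)), 0.0)
        for c in range(3):
            out[c] += albedo[c] * color[c] * ndotl  # add this light's contribution
    return tuple(min(v, 1.0) for v in out)          # clip to displayable range

# Example: a grey pixel facing the camera, lit by one frontal white light.
pixel = relight_pixel((0.5, 0.5, 0.5), (0.0, 0.0, 1.0),
                      [((0.0, 0.0, 1.0), (1.0, 1.0, 1.0))])
```

Because each light is simply summed in, lights sampled from a new HDRI map can be swapped at runtime without re-running the normal network, which is what makes per-frame relighting cheap and temporally stable.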
- On-Device Inference:
  - The proposed framework is designed for efficient on-device deployment, leveraging network quantization and real-time rendering.
  - This ensures privacy, low runtime, and immediate response to changes in lighting conditions or user inputs.
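The network quantization used for on-device deployment can be sketched generically as symmetric int8 post-training quantization of a weight tensor: weights are mapped to 8-bit codes with a per-tensor scale and dequantized at inference. This is a minimal, generic illustration, not the paper's deployment code:

```python
# Hedged sketch of symmetric int8 post-training weight quantization.
def quantize_int8(weights):
    """Map float weights to int8 codes plus a per-tensor scale."""
    scale = max(abs(w) for w in weights) / 127 or 1.0  # fall back for all-zero weights
    codes = [max(-127, min(127, round(w / scale))) for w in weights]
    return codes, scale

def dequantize(codes, scale):
    """Recover approximate float weights from int8 codes."""
    return [c * scale for c in codes]

w = [0.02, -0.5, 0.31, 0.0]
codes, scale = quantize_int8(w)
approx = dequantize(codes, scale)   # each entry within one scale step of w
```

Storing 8-bit codes instead of 32-bit floats cuts model size by roughly 4x and enables the integer arithmetic that mobile accelerators execute efficiently, which is what keeps on-device runtime low.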
The authors demonstrate the effectiveness, efficiency, and generalization of their approach through quantitative and qualitative evaluations. The proposed system paves the way for new possibilities in real-time video applications, including video conferencing, gaming, and augmented reality, by allowing dynamic, text-based control of lighting conditions.
Stats
The paper does not provide any specific numerical data or statistics in the main text. The focus is on the technical approach and qualitative results.
Quotes
The paper does not contain any striking quotes that support its key arguments.