Core Concepts
A novel 3D neural style transfer framework based on the 3D Gaussian Splatting (3DGS) representation that enables efficient and flexible stylization of 3D scenes with detailed style features and customizable perceptual control.
Abstract
The paper introduces StylizedGS, a 3D neural style transfer framework that leverages the 3D Gaussian Splatting (3DGS) representation to achieve efficient and controllable stylization of 3D scenes.
Key highlights:
The 3DGS representation brings high efficiency, enabling rapid stylization within a minute of training.
Proposes a GS filter that eliminates floaters in the reconstruction, which is crucial for the final stylization quality (see the filtering sketch after this list).
Exploits a nearest-neighbor feature matching (NNFM) style loss to capture detailed local style patterns, and incorporates a depth preservation loss to maintain the scene's geometric content (see the loss sketch after this list).
Introduces flexible perceptual control over color, scale, and spatial regions during the stylization process, empowering users to create customized artistic expressions.
Extensive experiments demonstrate the effectiveness and efficiency of the proposed method in terms of stylization quality and inference speed compared to existing 3D stylization approaches.
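The summary does not describe how the GS filter works, so the following is a minimal, hypothetical PyTorch sketch of floater removal: it prunes Gaussians with near-zero opacity and statistically outlying scales. The function name, thresholds, and pruning criterion are assumptions for illustration, not the paper's exact filter.

```python
import torch

def filter_floaters(means: torch.Tensor,
                    opacities: torch.Tensor,
                    scales: torch.Tensor,
                    opacity_thresh: float = 0.05,
                    scale_sigma: float = 3.0):
    """Prune likely floaters from a reconstructed 3DGS scene (hypothetical criterion).

    means:     (N, 3) Gaussian centers
    opacities: (N,)   per-Gaussian opacities in [0, 1]
    scales:    (N, 3) per-axis Gaussian scales
    """
    max_scale = scales.max(dim=-1).values
    # Keep Gaussians that are sufficiently opaque ...
    opaque = opacities > opacity_thresh
    # ... and whose size is not an extreme outlier w.r.t. the scene statistics.
    well_sized = max_scale < max_scale.mean() + scale_sigma * max_scale.std()
    keep = opaque & well_sized
    return means[keep], opacities[keep], scales[keep]
```

The idea suggested by the highlight is that floaters left in the photorealistic reconstruction would otherwise attract style gradients and degrade the final stylized scene.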
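As a reference for the loss terms named above, here is a minimal PyTorch sketch of an NNFM style loss and a depth preservation loss. The feature maps are assumed to come from a pretrained encoder such as VGG; the paper's layer choices, weights, and any extra normalization are not reproduced here.

```python
import torch
import torch.nn.functional as F

def nnfm_style_loss(render_feats: torch.Tensor, style_feats: torch.Tensor) -> torch.Tensor:
    """Nearest-neighbor feature matching style loss over (C, H, W) feature maps."""
    # Flatten spatial dims and L2-normalize each feature vector.
    r = F.normalize(render_feats.flatten(1).T, dim=-1)   # (H*W, C)
    s = F.normalize(style_feats.flatten(1).T, dim=-1)    # (M, C)
    # Cosine similarity between every rendered feature and every style feature.
    cos_sim = r @ s.T                                     # (H*W, M)
    # For each rendered feature, keep its best-matching style feature ...
    nearest_sim = cos_sim.max(dim=-1).values
    # ... and minimize the cosine distance to that nearest neighbor.
    return (1.0 - nearest_sim).mean()

def depth_preservation_loss(rendered_depth: torch.Tensor, reference_depth: torch.Tensor) -> torch.Tensor:
    """L1 penalty keeping the stylized scene's depth close to the original reconstruction."""
    return (rendered_depth - reference_depth).abs().mean()
```

A full objective would typically combine these terms with a content loss under user-chosen weights; the weighting scheme here is left unspecified.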
Stats
The main text does not quote specific numerical results; the key quantitative comparisons are reported in the paper's tables.
Quotes
"Our method exhibits a better style match to the style image compared to the others."
"Our method outperforms existing 3D stylization methods in terms of effectiveness and efficiency."