Coherent 3D Gaussian Splatting for Sparse Novel View Synthesis
We propose a regularized optimization approach to enable 3D Gaussian Splatting (3DGS) for sparse input views. Our key idea is to introduce coherency to the 3D Gaussians during optimization by constraining their movement in 2D image space using an implicit decoder and a total variation loss. We further leverage monocular depth and flow correspondences to initialize and regularize the 3D Gaussian representation, enabling high-quality texture and geometry reconstruction from extremely sparse inputs.
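To make the coherency constraint concrete, the sketch below shows a minimal total variation regularizer over a per-view, per-pixel parameter map of the kind the abstract describes (e.g., screen-space displacements or depth offsets produced by an implicit decoder). The function name `tv_loss`, the map shape, and the depth-offset interpretation are assumptions for illustration, not the authors' implementation.

```python
import torch

def tv_loss(param_map: torch.Tensor) -> torch.Tensor:
    """Total variation over a per-view, per-pixel parameter map.

    param_map: (B, C, H, W) tensor, e.g. hypothetical per-pixel depth
    offsets or 2D displacements assigned to the Gaussians of one input
    view. Penalizing differences between neighboring pixels encourages
    neighboring Gaussians to move coherently rather than independently.
    """
    dh = (param_map[..., 1:, :] - param_map[..., :-1, :]).abs().mean()
    dw = (param_map[..., :, 1:] - param_map[..., :, :-1]).abs().mean()
    return dh + dw

if __name__ == "__main__":
    # Stand-in for the output of an implicit decoder on one input view;
    # in practice this term would be added to the rendering loss.
    pred = torch.randn(1, 1, 64, 64, requires_grad=True)
    loss = tv_loss(pred)
    loss.backward()
    print(float(loss))
```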