The paper presents OmniColor, a novel and efficient algorithm for colorizing point clouds using an independent 360-degree camera. The key highlights are:
OmniColor addresses a central challenge in fusing LiDAR and camera data: inaccurate camera poses often lead to unsatisfactory mapping results. It proposes a global optimization that jointly refines the poses of all camera frames to maximize the photometric consistency of the colored point cloud.
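As a rough illustration of that objective (not the paper's exact formulation), one can penalize disagreement among the intensities that a point receives from the different frames observing it, and adjust the camera poses to shrink that disagreement. The function name and data layout below are hypothetical.

```python
import numpy as np

def photometric_consistency_cost(observed_intensities):
    """observed_intensities[i] holds the grayscale values sampled for point i
    from every camera frame that sees it (one value per observing frame).
    A well-aligned trajectory keeps each list nearly constant, so the sum of
    per-point variances is small; pose refinement seeks to minimize it.
    Illustrative sketch only, not the paper's exact cost."""
    cost = 0.0
    for samples in observed_intensities:
        if len(samples) > 1:                 # points seen once carry no signal
            cost += float(np.var(samples))   # disagreement across views
    return cost

# Toy usage: two points, each observed in three frames.
# The second point's samples disagree, so it dominates the cost.
print(photometric_consistency_cost([[0.52, 0.53, 0.51], [0.20, 0.60, 0.95]]))
```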
The method leverages the wide field of view (FOV) of a 360-degree camera to capture the surrounding scene, which helps reduce artifacts caused by illumination variation and provides sufficient correspondences, improving adaptability across diverse environments.
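For intuition, a 360-degree camera is usually modeled with a spherical (equirectangular) projection rather than a pinhole model, which is what gives essentially every surrounding point an image observation. Below is a minimal sketch of that projection; the axis convention (x right, y down, z forward) and the function name are assumptions, not taken from the paper.

```python
import numpy as np

def project_equirectangular(p_cam, width, height):
    """Map a 3D point in the camera frame to pixel coordinates of an
    equirectangular panorama. Assumes x right, y down, z forward; the exact
    convention varies between cameras."""
    d = p_cam / np.linalg.norm(p_cam)         # direction on the unit sphere
    lon = np.arctan2(d[0], d[2])              # longitude in [-pi, pi]
    lat = np.arcsin(d[1])                     # latitude in [-pi/2, pi/2]
    u = (lon / (2.0 * np.pi) + 0.5) * width   # column: wraps the full 360 degrees
    v = (lat / np.pi + 0.5) * height          # row: covers the full 180 degrees
    return u, v

# A point directly in front of the camera lands at the image center.
print(project_equirectangular(np.array([0.0, 0.0, 1.0]), 1920, 960))
```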
OmniColor introduces a point cloud co-visibility estimation approach to mitigate the impact of noise on the point cloud surface, which improves the optimization process.
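The paper's co-visibility estimator is its own contribution; purely for intuition, the sketch below shows what a simple visibility check can look like, using a per-pixel depth buffer over the equirectangular image. This is a generic stand-in rather than OmniColor's method, and the function name, depth_tol parameter, and projection convention are assumptions.

```python
import numpy as np

def visible_point_mask(points_cam, width, height, depth_tol=0.05):
    """Generic z-buffer visibility test over an equirectangular frame: keep the
    nearest depth per pixel and call a point visible only if it lies within
    depth_tol (metres, assumed) of that depth. points_cam is an (N, 3) array
    of points already expressed in the camera frame."""
    depths = np.linalg.norm(points_cam, axis=1)
    dirs = points_cam / depths[:, None]
    lon = np.arctan2(dirs[:, 0], dirs[:, 2])
    lat = np.arcsin(np.clip(dirs[:, 1], -1.0, 1.0))
    u = ((lon / (2.0 * np.pi) + 0.5) * width).astype(int) % width
    v = np.clip(((lat / np.pi + 0.5) * height).astype(int), 0, height - 1)

    zbuf = np.full((height, width), np.inf)
    np.minimum.at(zbuf, (v, u), depths)        # nearest point hitting each pixel
    return depths <= zbuf[v, u] + depth_tol    # occluded points fail this test
```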
The approach operates in an off-the-shelf manner, enabling seamless integration with any mobile mapping system while ensuring both convenience and accuracy. Extensive experiments demonstrate its superiority over existing frameworks.
Key takeaways obtained from Bonan Liu, Gu... on arxiv.org, 04-09-2024: https://arxiv.org/pdf/2404.04693.pdf