The article presents an autonomous Micro Aerial Vehicle (MAV) system that relies solely on passive visual and inertial sensors for under-canopy navigation in forests. The system uses visual-inertial simultaneous localization and mapping (VI-SLAM) for accurate state estimation and a volumetric occupancy submapping system for scalable mapping. A trajectory anchoring scheme is proposed to keep navigation safe through state updates, especially after loop closures. The system is validated in both real and simulated forest environments with high tree densities, completing all missions without collisions or failures.
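The trajectory anchoring idea can be illustrated with a small sketch: planned waypoints are stored relative to an anchor pose (e.g., a SLAM keyframe), so when a loop closure corrects that pose, the trajectory deforms rigidly with it and the controller keeps tracking a consistent path. This is a minimal 2D illustration with made-up poses, not the paper's implementation.

```python
import numpy as np

def se2(x, y, theta):
    """Homogeneous 2D transform (SE(2)) from translation and heading."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]])

# Hypothetical anchor pose before and after a loop-closure correction.
T_world_anchor_old = se2(10.0, 5.0, 0.10)
T_world_anchor_new = se2(10.4, 4.7, 0.05)  # drift-corrected pose

# Planned waypoints, stored in the anchor's local frame as (x, y, 1) columns.
waypoints_anchor = np.array([[1.0, 0.0, 1.0],
                             [2.0, 0.5, 1.0],
                             [3.0, 0.5, 1.0]]).T

# Before the update, the world-frame trajectory uses the old anchor pose.
traj_old = T_world_anchor_old @ waypoints_anchor

# After the state update, the same local waypoints are re-expressed with
# the corrected anchor pose: the trajectory "deforms" rigidly with the
# anchor instead of being re-planned from scratch.
traj_new = T_world_anchor_new @ waypoints_anchor
```

Because only the anchor pose changes, this re-expression can run at odometry rate with negligible cost.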
The work addresses the challenge of precise mapping over vast forest areas by leveraging advances in robotics, environment perception, and data analysis. By relying only on passive visual sensors, the system aims for cheaper, lighter, and more scalable drones suited to cluttered environments like forests. The approach combines accurate state estimation, dense mapping, and robust flight control to enable safe, collision-free autonomous navigation.
Key contributions include: performing under-canopy autonomous navigation with only visual and inertial sensors; introducing trajectory deformation at odometry rate to maintain continuous tracking through state updates; and demonstrating successful high-speed missions in dense forest environments without incidents. Using submaps instead of a monolithic map allows drift corrections from VI-SLAM to be handled effectively, while path planning remains safe because the online maps are anchored to SLAM poses.
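The submap idea can be sketched as follows: each occupancy submap stores its cells in the local frame of one SLAM keyframe, so a loop-closure correction only updates the keyframe pose and the whole submap moves rigidly, with no re-integration of sensor data. The class below is an illustrative 2D structure with invented names, not the paper's implementation.

```python
import numpy as np

def se2(x, y, theta):
    """Homogeneous 2D transform (SE(2)) from translation and heading."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]])

class Submap:
    """A small occupancy grid anchored to one SLAM keyframe pose.

    Occupied cells live in the keyframe's local frame, so when SLAM
    corrects the keyframe pose (e.g. after a loop closure) the whole
    submap moves rigidly and nothing has to be re-integrated.
    """
    def __init__(self, T_world_key, occupied_local, res=0.5):
        self.T_world_key = T_world_key                   # SE(2) pose, 3x3
        self.occupied = set(map(tuple, occupied_local))  # local cell indices
        self.res = res                                   # cell size [m]

    def is_occupied(self, p_world):
        # Express the world-frame query point in the submap's frame.
        p_local = np.linalg.inv(self.T_world_key) @ np.array([*p_world, 1.0])
        cell = tuple(np.floor(p_local[:2] / self.res).astype(int))
        return cell in self.occupied

# One submap whose keyframe sits at the world origin; local cell (2, 0)
# covers x in [1.0, 1.5), y in [0.0, 0.5).
sm = Submap(se2(0.0, 0.0, 0.0), [(2, 0)])
hit_before = sm.is_occupied((1.2, 0.1))   # inside the occupied cell

# A loop closure shifts the keyframe pose by +0.5 m in x; only the
# anchor pose changes -- the stored local cells stay untouched.
sm.T_world_key = se2(0.5, 0.0, 0.0)
hit_after = sm.is_occupied((1.7, 0.1))    # the obstacle moved with the map
```

A monolithic map would instead have to re-integrate or warp all affected cells after each correction, which is what the submap decomposition avoids.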
Source: arxiv.org