Core Concepts
The TAIL-Plus dataset provides a comprehensive set of multi-sensor data for developing and evaluating simultaneous localization and mapping (SLAM) algorithms for planetary exploration robots in unstructured, deformable granular environments.
Abstract
The TAIL-Plus dataset is an extension of the previous TAIL (Terrain-Aware multI-modaL) dataset, focusing on robot localization and mapping in planetary surface analog environments. The dataset was collected using both wheeled and quadruped robot platforms equipped with a diverse sensor suite, including 3D LiDAR, RGB-D cameras, global-shutter color cameras, RTK-GPS, and an IMU.
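Working with such a heterogeneous sensor suite typically means associating measurements by timestamp, since each sensor streams at its own rate. The sketch below shows one common approach, nearest-timestamp matching with a tolerance; the function name and the 10 ms tolerance are illustrative assumptions, not part of the dataset's tooling.

```python
import bisect

def nearest_timestamps(query_ts, ref_ts, max_dt=0.01):
    """For each query timestamp, return the index of the closest
    reference timestamp, or None if the gap exceeds max_dt seconds.
    ref_ts must be sorted in ascending order.
    (Illustrative helper; max_dt=0.01 is an assumed tolerance.)"""
    out = []
    for t in query_ts:
        i = bisect.bisect_left(ref_ts, t)
        # Candidates are the neighbors on either side of the insertion point.
        cands = [j for j in (i - 1, i) if 0 <= j < len(ref_ts)]
        j = min(cands, key=lambda j: abs(ref_ts[j] - t))
        out.append(j if abs(ref_ts[j] - t) <= max_dt else None)
    return out
```

For example, matching 10 Hz camera frames against a LiDAR timestamp list this way yields, per frame, either the index of the closest scan or None when no scan falls within tolerance.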
The dataset features a wide range of conditions, including different types of locomotion, multi-loop trajectories, day-night illumination changes, and varying surface terrain characteristics (coarse and fine sand). These challenging scenarios are designed to test the robustness and accuracy of multi-sensor SLAM algorithms for field robots in unstructured, deformable granular environments, which are representative of planetary exploration tasks.
The dataset provides time-synchronized, spatially calibrated sensor data, as well as 6-DOF ground truth poses from the IMU-integrated RTK-GPS system. This comprehensive dataset aims to support the development and evaluation of various multi-sensor SLAM approaches, such as LiDAR-inertial, visual-inertial, and LiDAR-visual-inertial fusion.
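A standard way to use such ground truth poses is to score an estimated trajectory with absolute trajectory error (ATE) after rigidly aligning it to the reference. Below is a minimal NumPy sketch of translation-only ATE with Umeyama alignment (without scale); it assumes both trajectories are already timestamp-associated N×3 position arrays, and the function names are illustrative, not part of any released evaluation tooling.

```python
import numpy as np

def align_umeyama(est, gt):
    """Least-squares rigid alignment (rotation R, translation t) such
    that R @ est_i + t best matches gt_i (Umeyama method, no scale)."""
    mu_e, mu_g = est.mean(0), gt.mean(0)
    E, G = est - mu_e, gt - mu_g
    # Cross-covariance between centered ground truth and estimate.
    U, _, Vt = np.linalg.svd(G.T @ E / len(est))
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1  # avoid a reflection
    R = U @ S @ Vt
    t = mu_g - R @ mu_e
    return R, t

def ate_rmse(est, gt):
    """Root-mean-square translational error after rigid alignment."""
    R, t = align_umeyama(est, gt)
    err = (est @ R.T + t) - gt
    return np.sqrt((err ** 2).sum(axis=1).mean())
```

Aligning before computing the error removes the arbitrary choice of world frame, so the metric reflects trajectory shape rather than the initial pose offset.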
The authors plan to further expand the dataset by incorporating additional sensor modalities, such as event cameras, infrared thermal cameras, and solid-state LiDARs, as well as exploring more diverse and challenging environments for planetary exploration analog scenarios.
Stats
The average commanded linear velocity ranges from 0.1 m/s to 1.2 m/s for the wheeled robot and from 0.3 m/s to 0.6 m/s for the quadruped robot.
The total sequence duration ranges from 74 seconds to 407 seconds.
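Taken together, these ranges loosely bound the per-sequence path length (distance ≈ commanded velocity × duration). The sketch below is illustrative arithmetic only; it assumes the stated duration range applies to both platforms, and actual path lengths depend on each run's commanded profile.

```python
def distance_bounds(v_min, v_max, t_min, t_max):
    """(Lower, upper) bound on path length in metres, given a
    velocity range in m/s and a duration range in seconds."""
    return v_min * t_min, v_max * t_max

# Using the ranges stated above (duration range assumed shared):
wheeled = distance_bounds(0.1, 1.2, 74, 407)    # roughly 7 m to 488 m
quadruped = distance_bounds(0.3, 0.6, 74, 407)  # roughly 22 m to 244 m
```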
Quotes
"Considering these, we utilize both wheeled and legged robots to develop datasets for multi-sensor fusion SLAM in unstructured, deformable granular environments."
"We hope the release of our datasets can help researchers developing multi-sensor localization and mapping algorithms for field robot perception in unstructured granular environments."