A Comprehensive Multimodal Dataset and Method for Capturing and Analyzing Complex Human Motions
RELI11D is a high-quality multimodal human motion dataset providing synchronized LiDAR, RGB, IMU, and event-camera data, enabling comprehensive analysis of complex and rapid human movements. The authors also propose LEIR, a multimodal baseline that fuses the geometric, appearance, and motion-dynamics cues from these modalities, achieving promising results on human pose estimation and global trajectory prediction.