This paper introduces DT-RaDaR, a novel robot navigation framework that prioritizes user privacy by leveraging RF ray-tracing within a digital twin environment, effectively circumventing the need for privacy-invasive sensors like cameras or LiDAR.
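The core geometric operation behind RF ray tracing, casting a ray through a map of wall segments and finding the nearest intersection, can be sketched in a very simplified 2D form. The map representation, function names, and wall format below are illustrative assumptions, not DT-RaDaR's actual implementation.

```python
def ray_segment_hit(ox, oy, dx, dy, x1, y1, x2, y2):
    """Distance along ray (origin o, direction d) to segment (p1, p2), or None.

    Solves o + t*d = p1 + u*s (s = p2 - p1) by Cramer's rule; a hit requires
    t >= 0 (in front of the origin) and 0 <= u <= 1 (within the segment).
    """
    sx, sy = x2 - x1, y2 - y1
    denom = dx * sy - dy * sx
    if abs(denom) < 1e-12:                       # ray parallel to segment
        return None
    t = ((x1 - ox) * sy - (y1 - oy) * sx) / denom  # distance along the ray
    u = ((x1 - ox) * dy - (y1 - oy) * dx) / denom  # position along the segment
    return t if t >= 0.0 and 0.0 <= u <= 1.0 else None

def trace_distance(ox, oy, dx, dy, walls):
    """Nearest hit distance over all walls, given as (x1, y1, x2, y2) tuples."""
    hits = [t for w in walls
            if (t := ray_segment_hit(ox, oy, dx, dy, *w)) is not None]
    return min(hits) if hits else None
```

A real RF digital twin would additionally model reflections, materials, and signal attenuation; this sketch covers only the line-of-sight geometry.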
This paper introduces HEIGHT, a novel robot navigation policy network that leverages a heterogeneous spatio-temporal graph to model interactions between robots, humans, and obstacles in crowded and constrained environments, leading to safer and more efficient navigation compared to existing methods.
This paper introduces FlowNav, a novel approach for robot navigation that leverages Conditional Flow Matching (CFM) to generate efficient and accurate action policies, outperforming diffusion-based methods in inference speed while achieving comparable accuracy.
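The speed advantage of flow matching comes from integrating a velocity field over a handful of ODE steps instead of running a long diffusion chain. The toy sketch below replaces FlowNav's learned, observation-conditioned network with the closed-form conditional velocity that transports any starting point to a known target action along a straight path; all names are hypothetical.

```python
import numpy as np

def fm_velocity(x, t, target):
    """Conditional OT velocity field: moves x to `target` exactly at t = 1.

    In a trained CFM model this would be a neural network v_theta(x, t | obs);
    here it is the analytic field of the straight-line probability path.
    """
    return (target - x) / (1.0 - t)

def sample_action(x0, target, n_steps=10):
    """Euler-integrate dx/dt = v(x, t) from t = 0 to t = 1."""
    x, dt = x0.copy(), 1.0 / n_steps
    for k in range(n_steps):
        t = k * dt
        x = x + dt * fm_velocity(x, t, target)
    return x
```

Even with very few Euler steps this field lands exactly on the target, which illustrates why CFM samplers need far fewer network evaluations than diffusion samplers.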
This paper proposes a novel method for safe and efficient robot navigation in unknown environments by combining harmonic potential fields for local navigation with oriented search trees for global task planning.
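A harmonic potential field solves Laplace's equation with obstacles held at high potential and the goal at low potential; because harmonic functions have no interior local minima, gradient descent cannot get stuck. A minimal grid-based sketch, with illustrative names and a simple Jacobi solver (not the paper's method), looks like this:

```python
import numpy as np

def harmonic_field(free, goal, iters=2000):
    """Jacobi relaxation of Laplace's equation on a grid.

    free: boolean grid, True where the robot may move. Obstacle cells are
    held at potential 1 and the goal cell at 0; the harmonic interior has
    no spurious local minima.
    """
    phi = np.ones(free.shape)
    for _ in range(iters):
        avg = 0.25 * (np.roll(phi, 1, 0) + np.roll(phi, -1, 0) +
                      np.roll(phi, 1, 1) + np.roll(phi, -1, 1))
        phi = np.where(free, avg, 1.0)   # clamp obstacles
        phi[goal] = 0.0                  # clamp goal
    return phi

def descend(phi, free, start, max_steps=200):
    """Follow steepest descent of phi from start; stops at the goal cell."""
    path, pos = [start], start
    for _ in range(max_steps):
        r, c = pos
        nbrs = [(r + dr, c + dc) for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if free[r + dr, c + dc]]
        nxt = min(nbrs, key=lambda p: phi[p])
        if phi[nxt] >= phi[pos]:
            break                        # reached the minimum (the goal)
        pos = nxt
        path.append(pos)
    return path
```

This captures only the local-navigation half of the summary; the paper's global task planning via oriented search trees sits on top of such a local field.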
This research introduces NUE, a novel robotic visual navigation system that leverages the uncertainty inherent in online-trained NeRF models to drive exploration, leading to more efficient learning of unknown environments and improved image-goal navigation performance.
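The general pattern of uncertainty-driven exploration is to score candidate viewpoints by model uncertainty and visit the most uncertain one. NUE derives uncertainty from its online-trained NeRF; the sketch below substitutes a hypothetical ensemble-disagreement score as a generic stand-in, with all names illustrative.

```python
import numpy as np

def pick_next_view(candidates, ensemble):
    """Pick the candidate viewpoint where ensemble predictions disagree most.

    ensemble: list of models, each mapping a viewpoint to a rendered
    prediction (an ndarray); disagreement = mean per-element variance.
    """
    def disagreement(view):
        preds = np.stack([model(view) for model in ensemble])
        return preds.var(axis=0).mean()
    return max(candidates, key=disagreement)
```

Regions the model already renders consistently score low and are skipped, so exploration is steered toward the least-understood parts of the environment.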
NeuronsGym presents a hybrid framework and benchmark for training robot navigation policies and evaluating their transfer from simulation to the real world; it introduces SFPL, an improved safety evaluation metric, and compares the performance of various sim-to-real transfer methods.
The FRTree Planner enables real-time, collision-free robot navigation in cluttered and unknown environments by dynamically constructing a tree of free regions and using a bi-level trajectory optimization that considers the robot's geometry.
X-MOBILITY, a novel end-to-end navigation model, leverages world modeling and imitation learning to achieve superior performance and generalization capabilities in robot navigation tasks, outperforming existing methods in challenging, out-of-distribution environments.
This paper introduces Active NTFields, a novel framework that leverages physics-informed neural networks to efficiently map unknown environments and generate real-time motion plans for robots by learning arrival time fields directly from sensor data.
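An arrival-time field T(x) gives the earliest time a robot can reach x from a source, and satisfies the eikonal equation |∇T| = 1/speed; Active NTFields learns this field with a physics-informed network. The classical quantity it approximates can be sketched with Dijkstra's algorithm on a unit-speed grid (a discrete eikonal solver); the grid setup and names below are illustrative, not the paper's code.

```python
import heapq

def arrival_times(free, source):
    """Discrete arrival-time field via Dijkstra on a 4-connected grid.

    free: set of traversable (row, col) cells. Returns T[cell] = shortest
    travel time from source, the quantity a neural eikonal solver learns.
    """
    T = {cell: float("inf") for cell in free}
    T[source] = 0.0
    pq = [(0.0, source)]
    while pq:
        t, (r, c) = heapq.heappop(pq)
        if t > T[(r, c)]:
            continue                      # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb = (r + dr, c + dc)
            if nb in T and t + 1.0 < T[nb]:
                T[nb] = t + 1.0
                heapq.heappush(pq, (T[nb], nb))
    return T
```

Once T is known (or learned), a motion plan to any point is simply gradient descent on T back toward the source, which is what makes real-time planning cheap.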
OrionNav enables robots to autonomously navigate unknown, dynamic environments and perform diverse tasks by combining real-time open-vocabulary semantic mapping, hierarchical scene graph generation, and LLM-based planning with low-level motion control.