Core Concepts
The authors propose PlanKD, a knowledge distillation method tailored for compressing end-to-end motion planners. By distilling planning-relevant features and incorporating a safety-aware waypoint-attentive mechanism, PlanKD offers a portable and safe solution for deployment on resource-limited hardware.
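The waypoint-attentive idea above can be illustrated with a minimal sketch: a distillation loss that weights each predicted waypoint by an importance score, so safety-critical waypoints dominate the objective. The function name, the (x, y) waypoint format, and the weighting scheme are all illustrative assumptions, not PlanKD's actual implementation.

```python
def waypoint_distillation_loss(student_wps, teacher_wps, weights):
    """Weighted mean squared error between student and teacher waypoints.

    student_wps, teacher_wps: lists of (x, y) waypoint tuples.
    weights: per-waypoint importance scores, e.g. larger for waypoints
             near potential collisions (hypothetical weighting).
    """
    assert len(student_wps) == len(teacher_wps) == len(weights)
    total_w = sum(weights)
    loss = 0.0
    for (sx, sy), (tx, ty), w in zip(student_wps, teacher_wps, weights):
        # Squared Euclidean distance between matched waypoints, scaled
        # by that waypoint's importance weight.
        loss += w * ((sx - tx) ** 2 + (sy - ty) ** 2)
    return loss / total_w

# Example: three waypoints, the last one (closest to a hazard, say)
# weighted most heavily so the student must match it more precisely.
student = [(0.0, 0.0), (1.0, 0.1), (2.0, 0.4)]
teacher = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
weights = [1.0, 1.0, 3.0]
print(waypoint_distillation_loss(student, teacher, weights))  # 0.098
```

In a real training loop this term would be combined with the student's own planning loss; the sketch only shows how non-uniform waypoint weights shift the distillation objective toward safety-critical steps.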
Abstract
The paper introduces PlanKD, a novel knowledge distillation framework designed to compress end-to-end motion planners efficiently. By distilling planning-relevant information and prioritizing safety-critical waypoints, PlanKD significantly improves the performance of smaller planners while reducing inference time by approximately 50%. Extensive experiments demonstrate the effectiveness of PlanKD in enhancing the safety and portability of autonomous driving systems.
Stats
Inference Time (ms / frame): 78.3, 39.7, 22.8, 17.2, 10.7, 8.5, 7.2
Driving Score: 53.44, 36.55, 55.90, 17.12, 28.79, 11.96, 26.15
Collision Rate (#/km): 0.090, 0.121, 0.094, 0.362, 0.315, 1.117, 0.361
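The "approximately 50%" inference-time claim can be sanity-checked against the stats above, assuming (an assumption on my part) that the first two entries correspond to the original planner and its compressed counterpart:

```python
# Relative inference-time reduction from the first two reported values
# (78.3 ms/frame vs 39.7 ms/frame). The pairing of these entries with
# specific models is assumed, not stated in the stats list.
baseline_ms = 78.3
compressed_ms = 39.7
reduction = 1 - compressed_ms / baseline_ms
print(f"{reduction:.1%}")  # prints 49.3%
```

A reduction of about 49.3% is consistent with the quoted "approximately 50%".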
Quotes
"PlanKD can boost the performance of smaller planners by a large margin."
"Our method can lower the inference time by approximately 50%."
"Experiments illustrate that our PlanKD can improve the performance of smaller planners by a large margin."