Core Concepts
Developing effective and inclusive explainable autonomous vehicle systems requires understanding the diverse needs of stakeholders, generating timely and human-friendly explanations, and enabling continuous learning.
Abstract
The review presents a comprehensive analysis of the current state of research on explainable autonomous vehicle (AV) systems. It identifies three primary topics: explanatory tasks, explanatory information, and explanatory information communication.
Explanatory Tasks:
The need for explanations varies depending on the stakeholders (internal and external), driving operations (perception, planning, localization, control), and the level of vehicle autonomy.
Explanations can be proactive (anticipating future needs) or reactive (responding to user requests), and can address critical or non-critical situations.
Proactive explanations in non-critical situations help build trust and acceptance, while proactive explanations in critical situations alert and prepare drivers for takeover or communicate the AV's intent to external stakeholders.
Reactive explanations in non-critical situations allow users to understand and potentially influence the AV's driving behavior, while reactive explanations in critical situations can provide evidence for post-incident forensic analysis.
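The four explanation-task categories above (proactive vs. reactive, critical vs. non-critical) form a simple two-axis taxonomy. A minimal sketch of that taxonomy follows; the enum and mapping names are illustrative, not from the review:

```python
from enum import Enum, auto

class Trigger(Enum):
    PROACTIVE = auto()  # anticipates a future need for explanation
    REACTIVE = auto()   # responds to an explicit user request

class Criticality(Enum):
    CRITICAL = auto()
    NON_CRITICAL = auto()

# Hypothetical mapping of each task category to the primary purpose
# described in the review summary above.
EXPLANATION_PURPOSE = {
    (Trigger.PROACTIVE, Criticality.NON_CRITICAL): "build trust and acceptance",
    (Trigger.PROACTIVE, Criticality.CRITICAL): "prepare driver for takeover or signal AV intent",
    (Trigger.REACTIVE, Criticality.NON_CRITICAL): "help users understand and influence driving behavior",
    (Trigger.REACTIVE, Criticality.CRITICAL): "provide evidence for post-incident forensic analysis",
}

def purpose(trigger: Trigger, criticality: Criticality) -> str:
    """Look up the primary purpose of an explanation task."""
    return EXPLANATION_PURPOSE[(trigger, criticality)]
```

For example, `purpose(Trigger.REACTIVE, Criticality.CRITICAL)` returns the forensic-analysis purpose, mirroring the last bullet above.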
Explanatory Information:
Transparency in AVs can take different forms, ranging from documentation of the system's general principles of operation to responses to user-initiated queries during interaction.
The layers of transparency required vary depending on the task and purpose, and can include information about the AV's perception, planning, localization, and control processes.
The explanatory information should be tailored to the needs of different stakeholders, from technical users to the general public.
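One way to picture tailoring explanatory information to different audiences is to tag each item with its transparency layer (the driving operations named above) and an audience level, then filter. This is a hypothetical sketch; the `Explanation` dataclass and `tailor` helper are not from the review:

```python
from dataclasses import dataclass

# Transparency layers drawn from the driving operations listed above.
LAYERS = ("perception", "planning", "localization", "control")

@dataclass
class Explanation:
    """One piece of explanatory information, tagged by layer and audience."""
    layer: str       # which driving operation it describes
    detail: str      # human-readable content
    technical: bool  # True if aimed at technical users only

def tailor(explanations: list[Explanation], technical_user: bool) -> list[Explanation]:
    """Filter explanations so the general public sees only non-technical items."""
    return [e for e in explanations if technical_user or not e.technical]
```

A member of the general public asking "why did the car slow down?" would then receive only the non-technical planning-layer items, while a technical user would see everything.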
Explanatory Information Communication:
Explanations can be communicated to internal stakeholders (drivers and passengers) through in-vehicle interfaces, and to external stakeholders (vulnerable road users) through external displays and signals.
The communication of explanations should be timely, intuitive, and adaptable to the user's level of expertise and the driving context.
Effective communication of explanations is crucial for building trust, fostering collaboration, and ensuring the safe and responsible deployment of AVs.
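The internal/external split above suggests a simple routing rule: choose the output channel from the stakeholder's relationship to the vehicle. A minimal sketch, with illustrative stakeholder and channel names not taken from the review:

```python
# Hypothetical stakeholder groups, following the internal/external
# distinction described above.
INTERNAL = {"driver", "passenger"}
EXTERNAL = {"pedestrian", "cyclist"}  # vulnerable road users

def channel(stakeholder: str) -> str:
    """Route an explanation to an output channel by stakeholder type."""
    if stakeholder in INTERNAL:
        return "in-vehicle interface"     # e.g. dashboard display or voice
    if stakeholder in EXTERNAL:
        return "external display/signal"  # e.g. an external panel or light signal
    raise ValueError(f"unknown stakeholder: {stakeholder}")
```

In practice the choice would also depend on timing, context, and the user's expertise, as the bullets above note; this sketch covers only the internal/external dimension.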
The review concludes by proposing a comprehensive roadmap for future research, grounded in the principles of responsible research and innovation, to address the challenges associated with implementing explainable AV systems.