Mangalindan, D. H., & Srivastava, V. Assistance-Seeking in Human-Supervised Autonomy: Role of Trust and Secondary Task Engagement (Extended Version). arXiv preprint arXiv:2405.20118v3 (2024).
This research investigates how a robot's assistance-seeking behavior affects human trust and performance in a dual-task scenario, aiming to design an optimal assistance-seeking policy that maximizes team performance.
The researchers conducted human-subject experiments using a dual-task paradigm where participants supervised a robot collecting objects while simultaneously performing a target-tracking task. They collected data on human trust ratings, task performance, and robot actions. Using this data, they developed and estimated models for human trust dynamics, target-tracking engagement dynamics, and human action selection probability. Finally, they designed an optimal assistance-seeking policy using Model Predictive Control (MPC) based on the estimated models.
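To make the modeling step concrete, the sketch below shows a linear trust-update rule of the general kind estimated from such experiments: trust rises after a robot success and decays otherwise. All parameter names and values here are illustrative assumptions, not the authors' estimated model.

```python
def update_trust(trust, robot_succeeded, alpha=0.8, beta=0.3):
    """One step of a simple linear trust dynamic (illustrative only):
    trust decays geometrically and is boosted by a robot success."""
    outcome = 1.0 if robot_succeeded else 0.0
    new_trust = alpha * trust + beta * outcome
    return min(max(new_trust, 0.0), 1.0)  # keep trust in [0, 1]

# Example: trust evolving over four robot task outcomes.
trust = 0.5
for succeeded in [True, True, False, True]:
    trust = update_trust(trust, succeeded)
```

In the paper's setting, the coefficients of such a model (and the analogous engagement dynamics) would be fit to the collected trust ratings and task data rather than chosen by hand.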
Modeling human trust and engagement dynamics enables the design of an optimal assistance-seeking policy that improves overall team performance in collaborative tasks. The policy should adapt to task complexity and the human's current state, seeking assistance when human trust is low or when engagement in the secondary task would be compromised.
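A receding-horizon (MPC-style) rule of this kind can be sketched as follows: simulate each candidate action sequence over a short horizon with the estimated dynamics, then commit only to the first action of the best sequence. The reward terms, update rules, and constants below are hedged assumptions for illustration, not the paper's estimated models.

```python
import itertools

def simulate(trust, engagement, actions):
    """Roll out a candidate action sequence; return total team reward
    under simple assumed dynamics (illustrative, not the paper's model)."""
    total = 0.0
    for a in actions:
        if a == "ask":
            # Human help is reliable but pulls attention off the secondary task.
            reward = 0.8
            engagement = max(engagement - 0.2, 0.0)
        else:  # "auto"
            # Autonomous success scales with current trust; success builds trust.
            reward = trust
            engagement = min(engagement + 0.1, 1.0)
            trust = min(trust + 0.05, 1.0)
        total += reward + 0.3 * engagement  # value of secondary-task engagement
    return total

def mpc_action(trust, engagement, horizon=3):
    """MPC step: evaluate all action sequences over the horizon,
    return the first action of the highest-reward sequence."""
    best = max(itertools.product(["ask", "auto"], repeat=horizon),
               key=lambda seq: simulate(trust, engagement, seq))
    return best[0]
```

With these assumed dynamics the rule behaves as the summary describes: at low trust it seeks human assistance, and at high trust it acts autonomously to preserve secondary-task engagement.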
This research contributes to the field of human-robot collaboration by providing insights into the factors influencing human trust and engagement during collaborative tasks. The proposed MPC-based assistance-seeking policy offers a practical approach to improve the efficiency and effectiveness of human-robot teams.
The study was limited to a specific dual-task scenario. Future research could explore the generalizability of the findings and the policy to other collaborative tasks and environments. Additionally, investigating the impact of different robot communication strategies on human trust and engagement could further enhance the design of assistance-seeking policies.