Planning with Partner Uncertainty Modeling for Efficient Information Revealing in Teamwork

Detailed Bibliography
Published in: 2020 15th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 319-327
Main Authors: Lo, Shih-Yun; Short, Elaine Schaertl; Thomaz, Andrea L.
Format: Conference Paper
Language: English
Published: New York, NY, USA: ACM, 9 March 2020
Series: ACM Conferences
ISBN: 1450367461, 9781450367462
ISSN: 2167-2148
Online Access: Get full text
Description
Summary: Communication among team members is important for efficient teamwork, to coordinate behavior and ensure that all team members have the information they need to complete the task. To enable effective communication and thus efficient teamwork, we propose a multi-agent planning approach to revealing information based on its benefit to joint team performance. By explicitly modeling the partner's knowledge and behavior, our approach allows a robot in a team to reason about when information is useful, how communication can be made effective, and how to communicate through efficient actions. That is, the robot provides only the information necessary for task completion, provides it at the time it is needed, and does so through the action(s) that optimize team performance. We validated this approach in a human study in which participants walked together with a robot to a destination known only to the robot. We compared our approach to a legible motion generation approach and showed that users perceived our approach as more natural, socially appropriate, and fluent to team with, while also finding it more predictable and clearer in intent. Ratings of our approach were equal to or higher than those of legible motion across all 18 survey items.
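The summary above describes choosing actions by their expected benefit to joint team performance under an explicit model of the partner's knowledge. As a rough, hypothetical illustration of that idea only (not the authors' implementation; all names, structure, and cost values below are invented for illustration), the following Python sketch scores candidate robot actions, some of which reveal the goal to the partner, by expected joint cost under an assumed probability that the partner already knows the destination.

# Illustrative sketch (assumed, not from the paper): decide whether and how to
# reveal information by comparing expected joint team cost with and without the
# partner knowing it, under a simple probabilistic model of the partner.

from dataclasses import dataclass
from typing import Iterable

@dataclass
class Candidate:
    action: str          # e.g. "keep_walking", "exaggerated_turn", "announce_destination"
    reveals_goal: bool   # does this action make the destination clear to the partner?
    robot_cost: float    # cost the robot itself incurs (detour, time, speech effort)

def expected_team_cost(candidate: Candidate,
                       partner_cost_if_informed: float,
                       partner_cost_if_uninformed: float,
                       p_partner_already_knows: float) -> float:
    """Expected joint cost = robot's own cost plus the partner's expected cost,
    where the partner's cost depends on whether they end up knowing the goal."""
    p_knows_after = 1.0 if candidate.reveals_goal else p_partner_already_knows
    partner_cost = (p_knows_after * partner_cost_if_informed
                    + (1.0 - p_knows_after) * partner_cost_if_uninformed)
    return candidate.robot_cost + partner_cost

def best_action(candidates: Iterable[Candidate],
                partner_cost_if_informed: float,
                partner_cost_if_uninformed: float,
                p_partner_already_knows: float) -> Candidate:
    # Pick the action minimizing expected joint cost; communication happens
    # only when revealing the goal pays for its own cost.
    return min(candidates,
               key=lambda c: expected_team_cost(c, partner_cost_if_informed,
                                                partner_cost_if_uninformed,
                                                p_partner_already_knows))

if __name__ == "__main__":
    candidates = [
        Candidate("keep_walking", reveals_goal=False, robot_cost=1.0),
        Candidate("exaggerated_turn", reveals_goal=True, robot_cost=1.5),
        Candidate("announce_destination", reveals_goal=True, robot_cost=0.5),
    ]
    # With an uninformed partner, cheap early communication minimizes expected
    # joint cost; if the partner likely already knows, the robot just keeps walking.
    choice = best_action(candidates,
                         partner_cost_if_informed=2.0,
                         partner_cost_if_uninformed=6.0,
                         p_partner_already_knows=0.2)
    print(choice.action)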
DOI: 10.1145/3319502.3374827