Effects of Robot Sound on Auditory Localization in Human-Robot Collaboration

Detailed bibliography
Published in: 2018 13th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 434–442
Main authors: Cha, Elizabeth; Fitter, Naomi T.; Kim, Yunkyung; Fong, Terrence; Matarić, Maja J.
Format: Conference paper
Language: English
Published: New York, NY, USA: ACM, 26 Feb 2018
Series: ACM Conferences
ISBN: 9781450349536, 1450349536
ISSN: 2167-2148
Description
Summary: Auditory cues facilitate situational awareness by enabling humans to infer what is happening in the nearby environment. Unlike humans, many robots do not continuously produce perceivable state-expressive sounds. In this work, we propose the use of iconic auditory signals that mimic the sounds produced by a robot's operations. In contrast to artificial sounds (e.g., beeps and whistles), these signals are primarily functional, providing information about the robot's actions and state. We analyze the effects of two variations of robot sound, tonal and broadband, on auditory localization during a human-robot collaboration task. Results from 24 participants show that both signals significantly improve auditory localization, but the broadband variation is preferred by participants. We then present a computational formulation for auditory signaling and apply it to the problem of auditory localization, using a human-subjects data collection with 18 participants to learn optimal signaling policies.
DOI: 10.1145/3171221.3171285