Effects of Robot Sound on Auditory Localization in Human-Robot Collaboration


Detailed bibliography
Published in: 2018 13th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 434–442
Main authors: Cha, Elizabeth; Fitter, Naomi T.; Kim, Yunkyung; Fong, Terrence; Matarić, Maja J.
Format: Conference paper.
Language: English
Publication details: New York, NY, USA: ACM, February 26, 2018
Series: ACM Conferences
ISBN:9781450349536, 1450349536
ISSN:2167-2148
Description
Summary: Auditory cues facilitate situational awareness by enabling humans to infer what is happening in the nearby environment. Unlike humans, many robots do not continuously produce perceivable state-expressive sounds. In this work, we propose the use of iconic auditory signals that mimic the sounds produced by a robot's operations. In contrast to artificial sounds (e.g., beeps and whistles), these signals are primarily functional, providing information about the robot's actions and state. We analyze the effects of two variations of robot sound, tonal and broadband, on auditory localization during a human-robot collaboration task. Results from 24 participants show that both signals significantly improve auditory localization, but the broadband variation is preferred by participants. We then present a computational formulation for auditory signaling and apply it to the problem of auditory localization using a human-subjects data collection with 18 participants to learn optimal signaling policies.
DOI:10.1145/3171221.3171285