DAVID: An open-source platform for real-time transformation of infra-segmental emotional cues in running speech

Bibliographic Details
Published in: Behavior Research Methods, Vol. 50, No. 1, pp. 323-343
Main Authors: Rachman, Laura; Liuni, Marco; Arias, Pablo; Lind, Andreas; Johansson, Petter; Hall, Lars; Richardson, Daniel; Watanabe, Katsumi; Dubal, Stéphanie; Aucouturier, Jean-Julien
Format: Journal Article
Language: English
Published: New York: Springer US, 01.02.2018; Springer Nature B.V.; Psychonomic Society, Inc.
ISSN: 1554-3528, 1554-351X
Description
Summary: We present an open-source software platform that transforms the emotional cues expressed in speech signals using audio effects such as pitch shifting, inflection, vibrato, and filtering. The emotional transformations can be applied to any audio file, but can also run in real time on live microphone input, with less than 20 ms of latency. We anticipate that this tool will be useful for the study of emotions in psychology and neuroscience, because it enables a high degree of control over the acoustic and emotional content of experimental stimuli in a variety of laboratory situations, including real-time social situations. We present the results of a series of validation experiments positioning the tool against several methodological requirements: that transformed emotions be recognized at above-chance levels, be valid in several languages (French, English, Swedish, and Japanese), and have a naturalness comparable to natural speech.
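
The audio effects named in the summary (pitch shift, inflection, vibrato, filtering) are standard signal-processing operations. As a rough offline illustration only (this is not the DAVID tool or its parameters; the file names and shift amount below are hypothetical), a pitch shift of a recorded utterance can be sketched with librosa:

    import librosa
    import soundfile as sf

    # Load a speech recording at its native sampling rate (hypothetical file name).
    y, sr = librosa.load("speech.wav", sr=None)

    # Shift the pitch up by half a semitone; a small upward shift is the kind of
    # subtle, infra-segmental cue the summary describes (value chosen for illustration).
    y_shifted = librosa.effects.pitch_shift(y, sr=sr, n_steps=0.5)

    # Write the transformed signal to a new file.
    sf.write("speech_shifted.wav", y_shifted, sr)

Note that an offline resynthesis like this does not demonstrate the platform's real-time, low-latency operation; it only illustrates the type of acoustic manipulation involved.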
DOI: 10.3758/s13428-017-0873-y