DAVID: An open-source platform for real-time transformation of infra-segmental emotional cues in running speech

Published in: Behavior Research Methods, Vol. 50, No. 1, pp. 323-343
Main authors: Rachman, Laura; Liuni, Marco; Arias, Pablo; Lind, Andreas; Johansson, Petter; Hall, Lars; Richardson, Daniel; Watanabe, Katsumi; Dubal, Stéphanie; Aucouturier, Jean-Julien
Format: Journal Article
Language: English
Publication details: New York: Springer US, 1 February 2018 (Springer Nature B.V.; Psychonomic Society, Inc.)
ISSN: 1554-351X, 1554-3528
Description
Summary: We present an open-source software platform that transforms emotional cues expressed by speech signals using audio effects like pitch shifting, inflection, vibrato, and filtering. The emotional transformations can be applied to any audio file, but can also run in real time, using live input from a microphone, with less than 20-ms latency. We anticipate that this tool will be useful for the study of emotions in psychology and neuroscience, because it enables a high level of control over the acoustical and emotional content of experimental stimuli in a variety of laboratory situations, including real-time social situations. We present here results of a series of validation experiments aiming to position the tool against several methodological requirements: that transformed emotions be recognized at above-chance levels, valid in several languages (French, English, Swedish, and Japanese), and with a naturalness comparable to natural speech.
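
The transformations named in the abstract (pitch shifting, inflection, vibrato, filtering) are standard digital audio effects. As a rough illustration of one of them, the Python sketch below applies a vibrato, i.e., a periodic pitch modulation, by reading a signal through a sinusoidally modulated fractional delay line. The function name and the rate/depth parameters are illustrative assumptions for this sketch, not DAVID's own implementation or parameter ranges.

import numpy as np

def apply_vibrato(signal, sample_rate, rate_hz=6.0, depth_ms=0.5):
    # Read the signal through a sinusoidally modulated fractional delay
    # line; the time-varying delay produces a periodic pitch modulation.
    # rate_hz and depth_ms are illustrative values, not DAVID's defaults.
    depth = depth_ms * 1e-3 * sample_rate          # modulation depth in samples
    n = np.arange(len(signal))
    delay = depth * (1.0 + np.sin(2.0 * np.pi * rate_hz * n / sample_rate))
    read_pos = np.clip(n - delay, 0.0, len(signal) - 1.0)
    i0 = np.floor(read_pos).astype(int)
    i1 = np.minimum(i0 + 1, len(signal) - 1)
    frac = read_pos - i0
    return (1.0 - frac) * signal[i0] + frac * signal[i1]  # linear interpolation

if __name__ == "__main__":
    sr = 44100
    t = np.arange(sr) / sr                         # one second of audio
    tone = 0.5 * np.sin(2.0 * np.pi * 220.0 * t)   # 220-Hz test tone
    out = apply_vibrato(tone, sr)
    print(out.shape)                               # (44100,)

A modulated delay line is one common way to realize vibrato offline; a real-time version such as the one described in the abstract would additionally need block-wise processing to stay within the sub-20-ms latency mentioned above.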
DOI: 10.3758/s13428-017-0873-y