DAVID: An open-source platform for real-time transformation of infra-segmental emotional cues in running speech

Bibliographic Details
Published in: Behavior Research Methods, Vol. 50, No. 1, pp. 323-343
Authors: Rachman, Laura; Liuni, Marco; Arias, Pablo; Lind, Andreas; Johansson, Petter; Hall, Lars; Richardson, Daniel; Watanabe, Katsumi; Dubal, Stéphanie; Aucouturier, Jean-Julien
Format: Journal Article
Language: English
Published: New York: Springer US, 1 February 2018
Springer Nature B.V.
Psychonomic Society, Inc.
ISSN: 1554-351X, 1554-3528
Online access: Full text
Description
Abstract: We present an open-source software platform that transforms emotional cues expressed by speech signals using audio effects like pitch shifting, inflection, vibrato, and filtering. The emotional transformations can be applied to any audio file, but can also run in real time, using live input from a microphone, with less than 20-ms latency. We anticipate that this tool will be useful for the study of emotions in psychology and neuroscience, because it enables a high level of control over the acoustical and emotional content of experimental stimuli in a variety of laboratory situations, including real-time social situations. We present here the results of a series of validation experiments aiming to position the tool against several methodological requirements: that transformed emotions be recognized at above-chance levels, be valid in several languages (French, English, Swedish, and Japanese), and have a naturalness comparable to that of natural speech.
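
The abstract names the infra-segmental cues the platform manipulates (pitch shifting, inflection, vibrato, filtering) but the record contains no implementation detail. As a rough illustration of one such effect, the sketch below applies a basic vibrato in Python/NumPy by reading the signal through a sinusoidally modulated delay line, which modulates the instantaneous pitch. This is a generic textbook technique, not DAVID's actual implementation, and the function name and parameter values are assumptions chosen for the example.

    import numpy as np

    def vibrato(x, sr, rate_hz=6.0, depth_ms=1.0):
        """Simple vibrato: read the signal through a sinusoidally
        modulated delay line, which modulates instantaneous pitch.
        Illustrative sketch only; parameters are not from the paper."""
        n = np.arange(len(x))
        depth = depth_ms * 1e-3 * sr                      # modulation depth in samples
        delay = depth * (1.0 + np.sin(2.0 * np.pi * rate_hz * n / sr))
        read_pos = np.clip(n - delay, 0.0, len(x) - 1.0)  # time-varying read position
        i0 = np.floor(read_pos).astype(int)               # linear interpolation between
        i1 = np.minimum(i0 + 1, len(x) - 1)               # neighboring samples
        frac = read_pos - i0
        return (1.0 - frac) * x[i0] + frac * x[i1]

    # Example: y = vibrato(x, sr) for a mono float signal x sampled at sr Hz.

With the values above (6-Hz modulation, 1-ms depth), the varying delay produces a pitch deviation of a fraction of a semitone, which is in the range typically used for speech vibrato effects.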
DOI: 10.3758/s13428-017-0873-y