DAVID: An open-source platform for real-time transformation of infra-segmental emotional cues in running speech

Bibliographic Details
Published in: Behavior Research Methods, Vol. 50, No. 1, pp. 323-343
Main Authors: Rachman, Laura, Liuni, Marco, Arias, Pablo, Lind, Andreas, Johansson, Petter, Hall, Lars, Richardson, Daniel, Watanabe, Katsumi, Dubal, Stéphanie, Aucouturier, Jean-Julien
Format: Journal Article
Language: English
Published: New York: Springer US (Springer Nature B.V.; Psychonomic Society, Inc.), 01.02.2018
ISSN: 1554-351X, 1554-3528
Description
Summary: We present an open-source software platform that transforms the emotional cues expressed by speech signals using audio effects such as pitch shifting, inflection, vibrato, and filtering. The emotional transformations can be applied to any audio file, but can also run in real time on live input from a microphone, with less than 20-ms latency. We anticipate that this tool will be useful for the study of emotions in psychology and neuroscience, because it enables a high level of control over the acoustical and emotional content of experimental stimuli in a variety of laboratory situations, including real-time social situations. We present here the results of a series of validation experiments aiming to position the tool against several methodological requirements: that the transformed emotions be recognized at above-chance levels, be valid in several languages (French, English, Swedish, and Japanese), and be perceived with a naturalness comparable to that of natural speech.
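
A minimal, hedged sketch (not the DAVID implementation itself): the Python snippet below approximates two of the effects named in the abstract, a constant pitch shift and a vibrato, applied offline to an audio file. The file names, library choices (numpy, librosa, soundfile), and parameter values are illustrative assumptions, not parameters documented for DAVID.

    import numpy as np
    import librosa
    import soundfile as sf

    # Load a speech recording (hypothetical file name); keep its native sample rate.
    y, sr = librosa.load("speech.wav", sr=None)

    # Constant upward pitch shift of +0.5 semitone (50 cents).
    shifted = librosa.effects.pitch_shift(y, sr=sr, n_steps=0.5)

    # Vibrato approximated as a slowly modulated read position (a variable delay):
    # an 8.5-Hz sinusoidal delay with ~0.33-ms peak amplitude yields roughly
    # +/-30 cents of periodic pitch deviation.
    rate_hz = 8.5
    depth_s = 3.3e-4
    t = np.arange(len(shifted)) / sr
    delay = depth_s * np.sin(2 * np.pi * rate_hz * t)
    read_pos = np.clip(t - delay, 0.0, t[-1])   # modulated read times
    vibrato = np.interp(read_pos, t, shifted)   # resample along those times

    sf.write("speech_transformed.wav", vibrato, sr)

Because this sketch processes a whole file at once, it does not reproduce the real-time, sub-20-ms-latency operation described in the abstract; it only illustrates the kind of acoustic manipulation involved.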
DOI: 10.3758/s13428-017-0873-y