GCE: An Audio-Visual Dataset for Group Cohesion and Emotion Analysis.

Saved in:
Detailed bibliography
Title: GCE: An Audio-Visual Dataset for Group Cohesion and Emotion Analysis.
Authors: Lim, Eunchae, Ho, Ngoc-Huynh, Pant, Sudarshan, Kang, Young-Shin, Jeon, Seong-Eun, Kim, Seungwon, Kim, Soo-Hyung, Yang, Hyung-Jeong
Source: Applied Sciences (2076-3417); Aug 2024, Vol. 14, Issue 15, p6742, 21p
Subjects: AFFECTIVE forecasting (Psychology), SOCIAL cohesion, PSYCHOLOGY students, GRADUATE students, EMOTIONS, IMAGE fusion
Abstract: We present the Group Cohesion and Emotion (GCE) dataset, which comprises 1029 segmented video clips sourced from YouTube. These videos cover a range of interactions, including interviews, meetings, informal discussions, and similar contexts. During annotation, graduate psychology students assigned each 30-second clip a cohesion level, ranging from 1 to 7, and an affective state of negative, neutral, or positive. We introduce a baseline model that uses visual and audio embedding techniques to study group cohesion and group emotion prediction, applying Multi-Head Attention (MHA) fusion to enhance cross-representation learning. Our investigation covers both unimodal and multimodal techniques, providing insights into the prediction of group cohesion and the detection of group emotion. The results demonstrate the effectiveness of the GCE dataset for examining group cohesion levels and emotional states. [ABSTRACT FROM AUTHOR]
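The Multi-Head Attention fusion mentioned in the abstract can be illustrated with a minimal sketch. This is a hypothetical NumPy reconstruction of generic cross-modal attention (video embeddings as queries, audio embeddings as keys/values), not the authors' actual architecture; the projection weights here are random, whereas in the paper they would be learned.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_modal_attention(video, audio, num_heads=4, seed=0):
    """One cross-attention pass: video frames attend over audio frames.

    video: (T_v, d) embeddings, audio: (T_a, d) embeddings.
    d must be divisible by num_heads. Returns fused (T_v, d) features.
    """
    T_v, d = video.shape
    assert d % num_heads == 0, "embedding dim must split evenly across heads"
    dh = d // num_heads
    rng = np.random.default_rng(seed)
    # Random stand-ins for learned query/key/value projection matrices.
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))

    # Project and split into heads: (num_heads, T, dh)
    Q = (video @ Wq).reshape(T_v, num_heads, dh).transpose(1, 0, 2)
    K = (audio @ Wk).reshape(-1, num_heads, dh).transpose(1, 0, 2)
    V = (audio @ Wv).reshape(-1, num_heads, dh).transpose(1, 0, 2)

    # Scaled dot-product attention per head: (num_heads, T_v, T_a)
    attn = softmax(Q @ K.transpose(0, 2, 1) / np.sqrt(dh))
    # Weighted sum of audio values, then concatenate heads back to (T_v, d)
    return (attn @ V).transpose(1, 0, 2).reshape(T_v, d)

# Toy usage: 10 video-frame embeddings attend over 20 audio-frame embeddings.
rng = np.random.default_rng(1)
fused = cross_modal_attention(rng.standard_normal((10, 32)),
                              rng.standard_normal((20, 32)))
```

The fused features would then feed a classification head for the 7-level cohesion and 3-class emotion labels described above.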
Copyright of Applied Sciences (2076-3417) is the property of MDPI and its content may not be copied or emailed to multiple sites without the copyright holder's express written permission. Additionally, content may not be used with any artificial intelligence tools or machine learning technologies. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
Database: Complementary Index