Classification of Music-Evoked Emotions from EEG signals using Self-Organizing Maps

Bibliographic Details
Published in: 2022 International Conference on Electrical, Computer and Energy Technologies (ICECET), pp. 1-6
Main Authors: Beltran-Velandia, Ferney; Gomez, Jonatan; Suarez, Miguel; Ojeda, Andres; Leon, Elizabeth
Format: Conference Proceeding
Language: English
Published: IEEE, 20 July 2022
Subjects:
Online Access: Full text
Abstract The use of music to address emotional, cognitive, and physical needs of an individual, as proposed by music therapy, requires at least identifying the emotional expression that a piece of music evokes in people. The aim of this study is to build a computational system that identifies emotions as temporal representations from electroencephalography (EEG) signals. First, a set of eight pieces of music is composed to theoretically evoke four emotions: happiness, sadness, calmness, and anger. Then, EEG signals from 30 participants are recorded, first while they remain silent for one minute (baseline) and then while they listen to the stimuli (the eight pieces of music). After listening to each stimulus, participants provide a score in terms of Arousal and Valence through the Self-Assessment Manikin (SAM) test. The EEGLife dataset is built by reducing noise and cleaning artifacts from the raw EEG signals and incorporating the participants' scores. A set of Self-Organizing Maps (the SOM model) is proposed to classify emotions. A subject-specific training schema, with 20% of each participant's data held out for validation, is used not only for the EEGLife dataset but also for the well-known DEAP benchmark dataset. Cross-correlation and bandpower features are extracted from both datasets to feed the model. A Density Peaks clustering algorithm and a fuzzy system are used to derive the model's final emotional scores. Precision, Recall, F1-Score, and Accuracy metrics are used to assess the performance of the proposed model. Results show that emotional expression information is extracted more easily from the EEGLife dataset than from the DEAP dataset. Finally, the SOM model has an interpretability property, which allows EEG signals to be analyzed visually on the Self-Organizing Maps.
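The abstract describes a concrete pipeline: bandpower and cross-correlation features, one subject-specific SOM with a 20% validation hold-out, and Density Peaks clustering plus a fuzzy system for the final emotional scores. A minimal Python sketch of that flow is given below, assuming scipy's Welch estimator for bandpower and the MiniSom library for the SOM; the sampling rate, frequency bands, grid size, and the majority-vote unit labeling are illustrative placeholders, not the authors' implementation.

```python
# Illustrative sketch only -- library choices (scipy, minisom) and every
# parameter below (sampling rate, bands, grid size) are assumptions, not
# taken from the paper.
import numpy as np
from scipy.signal import welch
from minisom import MiniSom  # pip install minisom

FS = 128  # assumed EEG sampling rate (DEAP ships preprocessed at 128 Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def bandpower_features(eeg, fs=FS):
    """eeg: array (channels, samples); returns per-channel band power."""
    freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs, axis=-1)
    df = freqs[1] - freqs[0]
    feats = [psd[:, (freqs >= lo) & (freqs < hi)].sum(axis=-1) * df
             for lo, hi in BANDS.values()]
    return np.concatenate(feats)

def crosscorr_features(eeg):
    """Zero-lag normalized cross-correlation of every channel pair."""
    z = (eeg - eeg.mean(axis=1, keepdims=True)) / (eeg.std(axis=1, keepdims=True) + 1e-12)
    corr = (z @ z.T) / eeg.shape[1]
    return corr[np.triu_indices_from(corr, k=1)]

def extract(eeg):
    return np.concatenate([bandpower_features(eeg), crosscorr_features(eeg)])

# Subject-specific schema: one SOM per participant, trained on 80% of that
# participant's feature vectors with 20% held out for validation.
def train_som(X_train, grid=(10, 10), iters=5000):
    som = MiniSom(grid[0], grid[1], X_train.shape[1], sigma=1.5, learning_rate=0.5)
    som.random_weights_init(X_train)
    som.train_random(X_train, iters)
    return som

# The paper maps SOM units to emotional scores with Density Peaks clustering
# and a fuzzy system; a simple majority vote per best-matching unit stands
# in for that step here.
def label_units(som, X_train, y_train):
    votes = {}
    for x, y in zip(X_train, y_train):
        votes.setdefault(som.winner(x), []).append(y)
    return {u: max(set(v), key=v.count) for u, v in votes.items()}

def predict(som, unit_labels, X):
    return np.array([unit_labels.get(som.winner(x), -1) for x in X])
```

Predictions from such a per-participant SOM can then be scored with Precision, Recall, F1-Score, and Accuracy (e.g. via sklearn.metrics.classification_report), matching the evaluation described in the abstract.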
Author Beltran-Velandia, Ferney (fbeltranv@unal.edu.co); Gomez, Jonatan; Suarez, Miguel; Ojeda, Andres; Leon, Elizabeth
Organization Alife and MIDAS Research Groups, Universidad Nacional de Colombia, Bogotá, Colombia
ContentType Conference Proceeding
DOI 10.1109/ICECET55527.2022.9872728
Discipline Music
EISBN 1665470879; 9781665470872
EndPage 6
ExternalDocumentID 9872728
Genre orig-research
GrantInformation Ministry of Education (funder ID 10.13039/501100002701); Universidad Nacional de Colombia (funder ID 10.13039/501100002753)
IsPeerReviewed false
IsScholarly false
Language English
PageCount 6
PublicationDate 2022-July-20
PublicationTitle 2022 International Conference on Electrical, Computer and Energy Technologies (ICECET)
PublicationTitleAbbrev ICECET
PublicationYear 2022
Publisher IEEE
StartPage 1
SubjectTerms Affective Computing
Analytical models
Brain modeling
Computational modeling
EEG signals
Feature extraction
Medical treatment
Music
Self-organizing feature maps
Self-Organizing Maps
Title Classification of Music-Evoked Emotions from EEG signals using Self-Organizing Maps
URI https://ieeexplore.ieee.org/document/9872728