Emotion recognition with deep learning using GAMEEMO data set


Detailed bibliography
Published in: Electronics Letters, Vol. 56, No. 25, pp. 1364-1367
Main authors: Alakus, T. B.; Turkoglu, I.
Format: Journal Article
Language: English
Published: The Institution of Engineering and Technology, 10.12.2020
ISSN: 0013-5194, 1350-911X
Description
Summary: Emotion recognition is actively used in brain–computer interface, health care, security, e-commerce, education and entertainment applications to increase and control human–machine interaction. Emotions affect people's lives and decision-making mechanisms throughout their lives. However, the fact that emotions vary from person to person, are an abstract concept, and depend on internal and external factors makes studies in this field difficult. In recent years, studies based on electroencephalography (EEG) signals, which perform emotion analysis in a more robust and reliable way, have gained momentum. In this article, emotion analysis based on EEG signals was performed to predict positive and negative emotions. The study consists of four parts. In the first part, EEG signals were obtained from the GAMEEMO data set. In the second stage, the spectral entropy values of the EEG signals of all channels were calculated, and in the third stage these values were classified with a bidirectional long short-term memory (BiLSTM) architecture. In the last stage, the performance of the deep-learning architecture was evaluated with accuracy, sensitivity, specificity and the receiver operating characteristic (ROC) curve. With the proposed method, an accuracy of 76.91% and a ROC value of 90% were obtained.
DOI: 10.1049/el.2020.2460