Efficient approach for EEG-based emotion recognition

Bibliographic Details
Published in: Electronics Letters, Vol. 56, Issue 25, pp. 1361-1364
Main Authors: Şengür, D., Siuly, S.
Format: Journal Article
Language: English
Published: The Institution of Engineering and Technology, 10.12.2020
ISSN: 0013-5194, 1350-911X
Online Access: Full text
Description
Abstract: Identification of human emotion from electroencephalogram (EEG) signals has become an emerging field in health monitoring applications, as EEG signals can give more diverse insight into emotional states. The aim of this study is to develop an efficient framework based on deep learning concepts for automatic identification of human emotion from EEG signals. In the proposed framework, the signals are first pre-processed by low-pass filtering to remove noise, and the delta rhythm is then extracted. The extracted rhythm signals are converted into EEG rhythm images by employing the continuous wavelet transform, and deep features are obtained using a pre-trained convolutional neural network model. Afterwards, MobileNetv2 is used for deep feature selection to obtain the most efficient features, and finally a long short-term memory (LSTM) method is employed for classification of the selected features. The proposed methodology is tested on the publicly available DEAP EEG data set. This study considers two emotions, 'Valence' and 'Arousal', for classification. The experimental results demonstrate that the proposed approach produced accuracies of 96.1% for low/high valence and 99.6% for low/high arousal classification. A further comparison is also carried out, showing that the proposed method outperforms the other compared methods.
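The abstract outlines a multi-stage pipeline: low-pass filtering, delta-rhythm extraction, continuous-wavelet-transform (CWT) images, deep features from a pre-trained CNN (MobileNetv2), and LSTM classification. The sketch below illustrates one plausible reading of that pipeline in Python. The filter order, noise cutoff, CWT scale range, image size, and LSTM configuration are assumptions for illustration only; the paper's exact parameters, and the precise roles of the pre-trained CNN and MobileNetv2, are not given in this record.

# Hypothetical sketch of the pipeline described in the abstract.
# Filter order, cutoffs, CWT scales, image size, and LSTM settings are assumptions.
import numpy as np
import pywt
from scipy.signal import butter, filtfilt
import tensorflow as tf

FS = 128          # DEAP recordings are commonly provided downsampled to 128 Hz
IMG_SIZE = 224    # MobileNetV2 default input resolution

def delta_rhythm(eeg, fs=FS):
    """Low-pass filter for noise removal, then isolate the delta band (< 4 Hz)."""
    b, a = butter(4, 45, btype='low', fs=fs)   # assumed 45 Hz noise cutoff
    clean = filtfilt(b, a, eeg)
    b, a = butter(4, 4, btype='low', fs=fs)    # delta rhythm: 0-4 Hz
    return filtfilt(b, a, clean)

def cwt_image(rhythm, fs=FS):
    """Convert a 1-D rhythm signal into a scalogram image via the CWT."""
    scales = np.arange(1, 65)                  # assumed scale range
    coeffs, _ = pywt.cwt(rhythm, scales, 'morl', sampling_period=1.0 / fs)
    img = np.abs(coeffs)
    img = (img - img.min()) / (img.max() - img.min() + 1e-12)
    img = tf.image.resize(img[..., None], (IMG_SIZE, IMG_SIZE)).numpy()
    return np.repeat(img, 3, axis=-1)          # 3 channels for the CNN backbone

# Pre-trained MobileNetV2 as the deep-feature extractor (1280-D vector per image).
backbone = tf.keras.applications.MobileNetV2(
    weights='imagenet', include_top=False, pooling='avg',
    input_shape=(IMG_SIZE, IMG_SIZE, 3))

def deep_features(images):
    """images: batch of scalograms in [0, 1], shape (N, 224, 224, 3)."""
    x = tf.keras.applications.mobilenet_v2.preprocess_input(images * 255.0)
    return backbone.predict(x, verbose=0)

def build_lstm(seq_len, feat_dim=1280):
    """Binary classifier (low vs. high valence or arousal) over feature sequences."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(seq_len, feat_dim)),
        tf.keras.layers.LSTM(128),
        tf.keras.layers.Dense(1, activation='sigmoid'),
    ])

In such a setup, one model would typically be trained per label (low/high valence and low/high arousal), compiled with binary cross-entropy; how the paper builds the feature sequences and selects features with MobileNetv2 is not detailed in this record.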
DOI: 10.1049/el.2020.2685