Cross-Dataset Emotion Valence Prediction Approach from 4-Channel EEG: CNN Model and Multi-Modal Evaluation

Bibliographic Details
Published in: Big Data and Cognitive Computing, Vol. 9, Issue 11, p. 280
Authors: Romaniuk, Vladimir; Kashevnik, Alexey
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 1 November 2025
ISSN: 2504-2289
DOI: 10.3390/bdcc9110280
Online access: Full text

Abstract
Emotion recognition based on electroencephalography (EEG) has gained significant attention due to its potential applications in human–computer interaction, affective computing, and mental health assessment. This study presents a convolutional neural network (CNN)-based approach to emotion valence prediction from 4-channel headband EEG data, together with an evaluation of the resulting model against computer vision-based emotion valence recognition. We trained a model on the publicly available FACED and SEED datasets and tested it on a newly collected dataset recorded using a wearable BrainBit headband. The model’s performance was evaluated using both a standard train–validation–test split and a leave-one-subject-out cross-validation strategy. Additionally, the model’s predictions were compared against a computer vision-based emotion recognition system to assess the reliability and consistency of EEG-based emotion prediction. Experimental results demonstrate that the CNN model achieves competitive accuracy in predicting emotion valence from EEG signals, despite the challenges posed by limited channel availability and individual variability. The findings demonstrate the usability of compact EEG devices for real-time emotion recognition and their potential integration into adaptive user interfaces and mental health applications.
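
The pipeline described in the abstract, a small CNN over 4-channel EEG windows evaluated with leave-one-subject-out (LOSO) cross-validation, can be illustrated with a minimal sketch. This is not the authors' implementation: the `ValenceCNN` architecture, the window length and sampling rate, the layer sizes, and the synthetic data below are all illustrative assumptions, written with PyTorch and NumPy.

```python
# Minimal sketch (NOT the paper's code): a tiny 1D CNN over 4-channel EEG
# windows plus a leave-one-subject-out (LOSO) evaluation loop. Architecture,
# window length, sampling rate, and the synthetic data are assumptions.
import numpy as np
import torch
import torch.nn as nn

N_CHANNELS = 4    # headband EEG (e.g., BrainBit) exposes 4 channels
WINDOW_LEN = 250  # assumed 1 s windows at 250 Hz

class ValenceCNN(nn.Module):
    """Maps an EEG window (4 x WINDOW_LEN) to binary valence logits."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(N_CHANNELS, 16, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # global average pooling over time
        )
        self.classifier = nn.Linear(32, 2)  # negative vs. positive valence

    def forward(self, x):  # x: (batch, 4, WINDOW_LEN)
        return self.classifier(self.features(x).squeeze(-1))

def loso_accuracy(windows, labels, subjects, epochs=5):
    """LOSO CV: train on all subjects but one, test on the held-out one."""
    accuracies = []
    for held_out in np.unique(subjects):
        train = subjects != held_out
        model, loss_fn = ValenceCNN(), nn.CrossEntropyLoss()
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)
        x_tr = torch.tensor(windows[train], dtype=torch.float32)
        y_tr = torch.tensor(labels[train], dtype=torch.long)
        for _ in range(epochs):  # full-batch training, for brevity
            opt.zero_grad()
            loss_fn(model(x_tr), y_tr).backward()
            opt.step()
        with torch.no_grad():
            x_te = torch.tensor(windows[~train], dtype=torch.float32)
            preds = model(x_te).argmax(dim=1).numpy()
        accuracies.append((preds == labels[~train]).mean())
    return float(np.mean(accuracies))

if __name__ == "__main__":
    # Synthetic stand-in for real EEG: 6 subjects, 20 windows each.
    rng = np.random.default_rng(0)
    windows = rng.standard_normal((120, N_CHANNELS, WINDOW_LEN)).astype(np.float32)
    labels = rng.integers(0, 2, 120)
    subjects = np.repeat(np.arange(6), 20)
    print(f"LOSO mean accuracy (random data, ~0.5): "
          f"{loso_accuracy(windows, labels, subjects):.3f}")
```

The point of the subject-level split is the one the abstract highlights: windows from the same person must never appear on both sides of the train/test boundary, so each fold measures generalization to an unseen subject rather than memorization of individual EEG characteristics.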