Development of a Universal Validation Protocol and an Open-Source Database for Multi-Contextual Facial Expression Recognition

Published in: Sensors (Basel, Switzerland), Vol. 23, No. 20, p. 8376
Main authors: La Monica, Ludovica; Cenerini, Costanza; Vollero, Luca; Pennazza, Giorgio; Santonico, Marco; Keller, Flavio
Format: Journal Article
Language: English
Publication details: Basel: MDPI AG, 10 October 2023
ISSN: 1424-8220
Description
Summary: Facial expression recognition (FER) poses a complex challenge due to diverse factors such as facial morphology variations, lighting conditions, and cultural nuances in emotion representation. To address these hurdles, specific FER algorithms leverage advanced data analysis to infer emotional states from facial expressions. In this study, we introduce a universal validation methodology that assesses any FER algorithm's performance through a web application in which subjects respond to emotive images. We present FeelPix, a labelled database of facial landmark coordinates generated during FER algorithm validation. FeelPix can be used to train and test generic FER algorithms that identify users' facial expressions. A test algorithm that classifies emotions from FeelPix data demonstrates the database's reliability. Designed as a computationally lightweight solution, it is suited to online systems. Our contribution improves facial expression recognition, enabling the identification and interpretation of emotions associated with facial expressions and offering insights into individuals' emotional reactions, with implications for healthcare, security, human-computer interaction, and entertainment.
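The abstract describes a lightweight classifier trained on facial-landmark coordinates. As a rough illustration of that idea only (the actual FeelPix schema, feature layout, and label set are not published in this record, so everything below is a hypothetical stand-in), a nearest-centroid classifier over landmark feature vectors can be sketched in a few lines:

```python
# Minimal sketch: classifying emotions from facial-landmark feature
# vectors with a nearest-centroid classifier. The two-feature toy
# vectors and the "happy"/"sad" labels are invented for illustration
# and do NOT reflect the real FeelPix data format.
import math
from collections import defaultdict

def centroids(samples):
    """Compute the mean feature vector per label.
    samples: list of (label, feature_vector) pairs."""
    sums = defaultdict(lambda: None)
    counts = defaultdict(int)
    for label, vec in samples:
        if sums[label] is None:
            sums[label] = list(vec)
        else:
            sums[label] = [a + b for a, b in zip(sums[label], vec)]
        counts[label] += 1
    return {lab: [v / counts[lab] for v in s] for lab, s in sums.items()}

def classify(vec, cents):
    """Return the label whose centroid is nearest to vec."""
    return min(cents, key=lambda lab: math.dist(vec, cents[lab]))

# Toy training data: hypothetical landmark-derived features.
train = [
    ("happy", [0.9, 0.1]), ("happy", [0.8, 0.2]),
    ("sad",   [0.1, 0.9]), ("sad",   [0.2, 0.8]),
]
cents = centroids(train)
print(classify([0.85, 0.15], cents))  # → happy
```

Nearest-centroid is chosen here only because it is about the simplest "computationally lightweight" classifier one could run in an online system; the paper's actual test algorithm may differ.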
Current address: National Research Council, Institute of Cognitive Sciences and Technologies, 00185 Rome, Italy.
DOI: 10.3390/s23208376