Facial Action Unit Detection using 3D Face Landmarks for Pain Detection

Bibliographic Details
Published in: 2023 45th Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Vol. 2023, pp. 1-5
Main Authors: Feghoul, Kevin, Bouazizi, Mondher, Maia, Deise Santana
Format: Conference Proceeding; Journal Article
Language: English
Published: IEEE, United States, 24.07.2023
ISSN: 2694-0604
Description
Summary: Automatic detection of facial action units (AUs) has recently gained attention for its applications in facial expression analysis. However, using AUs in research can be challenging since they are typically manually annotated, which can be time-consuming, repetitive, and error-prone. Advancements in automated AU detection can greatly reduce the time required for this task and improve the reliability of annotations for downstream tasks, such as pain detection. In this study, we present an efficient method for detecting AUs using only 3D face landmarks. Using the detected AUs, we trained state-of-the-art deep learning models to detect pain, which validates the effectiveness of the AU detection model. Our study also establishes a new benchmark for pain detection on the BP4D+ dataset, demonstrating an 11.13% improvement in F1-score and a 3.09% improvement in accuracy using a Transformer model compared to existing studies. Our results show that utilizing only eight predicted AUs still achieves competitive results when compared to using all 34 ground-truth AUs.
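
The abstract describes a two-stage pipeline: per-frame AU prediction from 3D face landmarks, followed by a Transformer trained on the predicted AUs to classify pain. The sketch below illustrates that kind of pipeline in PyTorch under explicit assumptions; the landmark count (468, a MediaPipe-style face mesh), layer sizes, sequence length, and the set of eight AUs are illustrative guesses, not the authors' configuration or code.

# Hedged sketch (not from the paper): 3D landmarks -> AU probabilities -> Transformer pain classifier.
import torch
import torch.nn as nn

NUM_LANDMARKS = 468   # assumption: MediaPipe-style 3D face mesh
NUM_AUS = 8           # the abstract reports competitive results with 8 predicted AUs
SEQ_LEN = 64          # assumed number of frames per clip

class AUDetector(nn.Module):
    """Per-frame AU occurrence from flattened 3D landmarks (illustrative MLP)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NUM_LANDMARKS * 3, 256), nn.ReLU(),
            nn.Linear(256, NUM_AUS),
        )

    def forward(self, landmarks):                    # (batch, frames, landmarks*3)
        return torch.sigmoid(self.net(landmarks))    # per-frame AU probabilities

class PainTransformer(nn.Module):
    """Binary pain classifier over a sequence of per-frame AU vectors."""
    def __init__(self, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Linear(NUM_AUS, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, 1)

    def forward(self, au_seq):                       # (batch, frames, NUM_AUS)
        h = self.encoder(self.embed(au_seq))
        return self.head(h.mean(dim=1))              # pooled pain logit per clip

# Toy end-to-end pass with random tensors standing in for BP4D+ samples.
landmarks = torch.randn(2, SEQ_LEN, NUM_LANDMARKS * 3)
aus = AUDetector()(landmarks)
pain_logits = PainTransformer()(aus)
print(pain_logits.shape)  # torch.Size([2, 1])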
DOI:10.1109/EMBC40787.2023.10340059