Algorithmically Enhanced Wearable Multimodal Emotion Sensor

Saved in:
Detailed bibliography
Title: Algorithmically Enhanced Wearable Multimodal Emotion Sensor
Authors: Anand Babu, Getnet Kassahun, Isabelle Dufour, Dipankar Mandal, Damien Thuau
Source: Advanced Intelligent Systems, Vol 7, Iss 5, Pp n/a-n/a (2025)
Publisher information: Wiley, 2025.
Year of publication: 2025
Collection: LCC:Computer engineering. Computer hardware
Subjects: human–machine interactions, machine learning, sensors, wearable, Computer engineering. Computer hardware, TK7885-7895, Control engineering systems. Automatic machinery (General), TJ212-225
Description: Sensing diverse human emotions offers significant potential for understanding deeper cognitive processes and diagnosing neurological diseases. However, capturing the full spectrum of emotions remains a significant challenge due to their intricate and subjective nature. Traditional emotion‐sensing techniques often focus on singular emotions, limiting their ability to grasp the complexity of emotional experiences. Herein, a fully printed, organic wearable sensor capable of multimodal emotion sensing by noninvasively monitoring physiological indicators such as heart rate, breathing patterns, and voice signatures is presented. The recorded signals are processed using a long short‐term memory (LSTM) neural network, achieving over 91% classification accuracy in distinguishing different emotions through a data fusion approach, showing >9% enhancement in accuracy as compared to feature fusion. As a proof of concept, Q‐learning is implemented with the data to simulate emotional responses in a robotic model. The study's approach provides a pathway to understanding complex human emotions and enhances the capabilities of effective human–machine interaction.
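The abstract's proof of concept uses Q-learning to map classified emotions to simulated robotic responses. The following is a minimal tabular Q-learning sketch of that idea; the emotion states, robot actions, and reward structure here are illustrative assumptions, not the paper's actual setup.

```python
import random

# Illustrative emotion states (as an LSTM classifier might output)
# and hypothetical robot responses; both are assumptions.
STATES = ["happy", "sad", "angry", "calm"]
ACTIONS = ["approach", "comfort", "retreat", "idle"]
# Assumed "correct" response per emotion, used only to define rewards.
TARGET = {"happy": "approach", "sad": "comfort",
          "angry": "retreat", "calm": "idle"}

def train(episodes=5000, alpha=0.5, gamma=0.9, epsilon=0.1, seed=0):
    """Tabular Q-learning with an epsilon-greedy policy."""
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
    for _ in range(episodes):
        s = rng.choice(STATES)                       # observed emotion
        if rng.random() < epsilon:                   # explore
            a = rng.choice(ACTIONS)
        else:                                        # exploit
            a = max(ACTIONS, key=lambda x: q[(s, x)])
        r = 1.0 if a == TARGET[s] else -1.0          # reward signal
        s2 = rng.choice(STATES)                      # next emotion reading
        best_next = max(q[(s2, a2)] for a2 in ACTIONS)
        # Standard Q-learning update rule
        q[(s, a)] += alpha * (r + gamma * best_next - q[(s, a)])
    return q

def respond(q, emotion):
    """Pick the learned response for a classified emotion."""
    return max(ACTIONS, key=lambda a: q[(emotion, a)])
```

After training, `respond(q, "sad")` returns the action that accumulated the highest reward for that state, which under these assumed rewards converges to the target response.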
Document type: article
File description: electronic resource
Language: English
ISSN: 2640-4567
Relation: https://doaj.org/toc/2640-4567
DOI: 10.1002/aisy.202400567
Access URL: https://doaj.org/article/b5e738997a3a430480910808a6752e28
Accession number: edsdoj.b5e738997a3a430480910808a6752e28
Database: Directory of Open Access Journals