Multimodal Pain Recognition in Postoperative Patients: Machine Learning Approach

Bibliographic Details
Published in: JMIR Formative Research, Vol. 9, p. e67969
Main authors: Subramanian, Ajan, Cao, Rui, Naeini, Emad Kasaeyan, Aqajari, Seyed Amir Hossein, Hughes, Thomas D, Calderon, Michael-David, Zheng, Kai, Dutt, Nikil, Liljeberg, Pasi, Salanterä, Sanna, Nelson, Ariana M, Rahmani, Amir M
Format: Journal Article
Language: English
Published: Canada: JMIR Publications, 27 January 2025
Keywords:
ISSN: 2561-326X
Online access: Full text
Abstract
Background: Acute pain management is critical in postoperative care, especially in vulnerable patient populations who may be unable to self-report pain levels effectively. Current methods of pain assessment often rely on subjective patient reports or behavioral pain observation tools, which can lead to inconsistencies in pain management. Multimodal pain assessment, integrating physiological and behavioral data, presents an opportunity to create more objective and accurate pain measurement systems. However, most previous work has focused on healthy subjects in controlled environments, with limited attention to real-world postoperative pain scenarios. This gap necessitates the development of robust, multimodal approaches capable of addressing the unique challenges of assessing pain in clinical settings, where factors such as motion artifacts, imbalanced label distribution, and sparse data further complicate pain monitoring.
Objective: This study aimed to develop and evaluate a multimodal machine learning-based framework for the objective assessment of pain in postoperative patients in real clinical settings using biosignals such as electrocardiogram, electromyogram, electrodermal activity, and respiration rate (RR) signals.
Methods: The iHurt study was conducted on 25 postoperative patients at the University of California, Irvine Medical Center. The study captured multimodal biosignals during light physical activities, with concurrent self-reported pain levels using the Numerical Rating Scale. Data preprocessing involved noise filtering, feature extraction, and combining handcrafted and automatic features through convolutional and long short-term memory autoencoders. Machine learning classifiers, including support vector machine, random forest, adaptive boosting, and k-nearest neighbors, were trained using weak supervision and minority oversampling to handle sparse and imbalanced pain labels. Pain levels were categorized into baseline and 3 levels of pain intensity (1-3).
Results: The multimodal pain recognition models achieved an average balanced accuracy of over 80% across the different pain levels. RR models consistently outperformed other single modalities, particularly for lower pain intensities, while facial muscle activity (electromyogram) was most effective for distinguishing higher pain intensities. Although single-modality models, especially RR, generally outperformed multimodal approaches, our multimodal framework still delivered results that surpassed most previous work in terms of overall accuracy.
Conclusions: This study presents a novel, multimodal machine learning framework for objective pain recognition in postoperative patients. The results highlight the potential of integrating multiple biosignal modalities for more accurate pain assessment, with particular value in real-world clinical settings.
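The Methods paragraph above mentions combining handcrafted features with automatic features learned by convolutional and long short-term memory autoencoders. The sketch below is purely illustrative and not the authors' implementation: it shows, in Keras, how an LSTM autoencoder can compress fixed-length biosignal windows into latent vectors that could later be concatenated with handcrafted features. The window length, latent dimension, and single-channel input are assumptions.

```python
# Minimal LSTM-autoencoder sketch (illustrative only; not the study's code).
# Assumed: 128-sample single-channel windows, 16-dimensional latent features.
import numpy as np
from tensorflow.keras import layers, Model

window_len, n_channels, latent_dim = 128, 1, 16

inputs = layers.Input(shape=(window_len, n_channels))
latent = layers.LSTM(latent_dim)(inputs)                             # encoder: window -> latent vector
repeated = layers.RepeatVector(window_len)(latent)                   # tile latent across time steps
decoded = layers.LSTM(n_channels, return_sequences=True)(repeated)   # decoder: reconstruct the window

autoencoder = Model(inputs, decoded)
encoder = Model(inputs, latent)   # reused afterwards to extract "automatic" features
autoencoder.compile(optimizer="adam", loss="mse")

# Random data stands in for preprocessed biosignal windows.
windows = np.random.rand(256, window_len, n_channels).astype("float32")
autoencoder.fit(windows, windows, epochs=2, batch_size=32, verbose=0)

# Latent features per window, to be concatenated with handcrafted features downstream.
auto_features = encoder.predict(windows, verbose=0)
print(auto_features.shape)  # (256, 16)
```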
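The same paragraph also outlines the classification setup: minority oversampling to counter the imbalanced pain labels, four standard classifiers (support vector machine, random forest, adaptive boosting, k-nearest neighbors), and balanced accuracy as the evaluation metric. The following sketch, using scikit-learn and imbalanced-learn on a synthetic imbalanced dataset, illustrates that setup only; it omits the weak supervision step and uses random data in place of the study's extracted biosignal features.

```python
# Illustrative sketch (not the authors' pipeline): minority oversampling plus the
# four classifiers named in the abstract, scored with balanced accuracy.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import balanced_accuracy_score
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier
from sklearn.neighbors import KNeighborsClassifier
from imblearn.over_sampling import SMOTE  # third-party package: imbalanced-learn

# Synthetic stand-in for extracted biosignal features: 4 classes
# (baseline + pain levels 1-3) with a deliberately skewed label distribution.
X, y = make_classification(n_samples=2000, n_features=40, n_informative=15,
                           n_classes=4, weights=[0.7, 0.15, 0.1, 0.05],
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, test_size=0.3, random_state=0)

# Oversample minority pain levels in the training split only.
X_res, y_res = SMOTE(random_state=0).fit_resample(X_train, y_train)

classifiers = {
    "Support vector machine": SVC(),
    "Random forest": RandomForestClassifier(random_state=0),
    "Adaptive boosting": AdaBoostClassifier(random_state=0),
    "k-nearest neighbors": KNeighborsClassifier(),
}
for name, clf in classifiers.items():
    clf.fit(X_res, y_res)
    print(f"{name}: balanced accuracy = "
          f"{balanced_accuracy_score(y_test, clf.predict(X_test)):.2f}")
```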
Author Subramanian, Ajan
Cao, Rui
Naeini, Emad Kasaeyan
Aqajari, Seyed Amir Hossein
Hughes, Thomas D
Calderon, Michael-David
Zheng, Kai
Dutt, Nikil
Liljeberg, Pasi
Salanterä, Sanna
Nelson, Ariana M
Rahmani, Amir M
AuthorAffiliation 1 Department of Computer Science, University of California, Irvine, Irvine, CA, United States
2 Department of Electrical Engineering and Computer Science, University of California, Irvine, Irvine, CA, United States
3 School of Nursing, University of California, Irvine, Irvine, CA, United States
4 College of Medicine, Kansas City University, Kansas City, MO, United States
5 Department of Informatics, University of California, Irvine, Irvine, CA, United States
6 Department of Computing, University of Turku, Turku, Finland
7 Department of Nursing Science, University of Turku, Turku, Finland
8 Turku University Hospital, University of Turku, Turku, Finland
9 Department of Anesthesiology and Pain Medicine, University of California, Irvine, Irvine, CA, United States
10 Institute for Future Health, University of California, Irvine, Irvine, CA, United States
Author ORCIDs Subramanian, Ajan: 0000-0003-3253-1300
Cao, Rui: 0000-0001-9295-8299
Naeini, Emad Kasaeyan: 0000-0002-7438-2641
Aqajari, Seyed Amir Hossein: 0000-0003-1747-6980
Hughes, Thomas D: 0000-0002-0651-9394
Calderon, Michael-David: 0000-0001-6824-6945
Zheng, Kai: 0000-0003-4121-4948
Dutt, Nikil: 0000-0002-3060-8119
Liljeberg, Pasi: 0000-0002-9392-3589
Salanterä, Sanna: 0000-0003-2529-6699
Nelson, Ariana M: 0000-0003-1575-1635
Rahmani, Amir M: 0000-0003-0725-1155
BackLink https://www.ncbi.nlm.nih.gov/pubmed/39869898 (View this record in MEDLINE/PubMed)
CitedBy DOIs 10.56294/cid2025127; 10.56294/cid2025154; 10.1038/s41598-025-12476-8
ContentType Journal Article
Copyright Ajan Subramanian, Rui Cao, Emad Kasaeyan Naeini, Seyed Amir Hossein Aqajari, Thomas D Hughes, Michael-David Calderon, Kai Zheng, Nikil Dutt, Pasi Liljeberg, Sanna Salanterä, Ariana M Nelson, Amir M Rahmani. Originally published in JMIR Formative Research (https://formative.jmir.org), 27.01.2025.
2025. This work is licensed under https://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Ajan Subramanian, Rui Cao, Emad Kasaeyan Naeini, Seyed Amir Hossein Aqajari, Thomas D Hughes, Michael-David Calderon, Kai Zheng, Nikil Dutt, Pasi Liljeberg, Sanna Salanterä, Ariana M Nelson, Amir M Rahmani. Originally published in JMIR Formative Research (https://formative.jmir.org), 27.01.2025. 2025
DOI 10.2196/67969
Discipline Medicine
EISSN 2561-326X
ExternalDocumentID oai_doaj_org_article_69a0eda3b0db4d5881bfb320797965aa
PMC11811662
39869898
10_2196_67969
Genre Journal Article
GeographicLocations United States--US
California
ISSN 2561-326X
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Keywords machine learning approach
pain recognition
acute pain
signal processing
pain monitoring
pain assessment
electrodermal activity
pain intensity
multimodal machine learning–based framework
electromyogram
self-reported pain level
weak supervision
multimodal information fusion
machine learning–based framework
clinical pain management
electrocardiogram
pain measurement
pain intensity recognition
behavioral pain
health care
Language English
License Ajan Subramanian, Rui Cao, Emad Kasaeyan Naeini, Seyed Amir Hossein Aqajari, Thomas D Hughes, Michael-David Calderon, Kai Zheng, Nikil Dutt, Pasi Liljeberg, Sanna Salanterä, Ariana M Nelson, Amir M Rahmani. Originally published in JMIR Formative Research (https://formative.jmir.org), 27.01.2025.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Formative Research, is properly cited. The complete bibliographic information, a link to the original publication on https://formative.jmir.org, as well as this copyright and license information must be included.
OpenAccessLink https://www.proquest.com/docview/3164306554?pq-origsite=%requestingapplication%
PMID 39869898
PublicationDate 2025-01-27
PublicationPlace Toronto, Canada
PublicationTitle JMIR formative research
PublicationTitleAlternate JMIR Form Res
PublicationYear 2025
Publisher JMIR Publications
StartPage e67969
SubjectTerms Adult
Aged
Behavior
Chronic pain
Data collection
Datasets
Dementia
Electrocardiography
Electroencephalography
Female
Heart rate
Humans
Machine Learning
Male
Medical research
Middle Aged
Minority & ethnic groups
Original Paper
Pain Measurement - methods
Pain, Postoperative - diagnosis
Patients
Physiology
Software
Title Multimodal Pain Recognition in Postoperative Patients: Machine Learning Approach
URI https://www.ncbi.nlm.nih.gov/pubmed/39869898
https://www.proquest.com/docview/3164306554
https://www.proquest.com/docview/3160461674
https://pubmed.ncbi.nlm.nih.gov/PMC11811662
https://doaj.org/article/69a0eda3b0db4d5881bfb320797965aa
Volume 9