Model for Determining the Psycho-Emotional State of a Person Based on Multimodal Data Analysis

Published in: Applied Sciences, Vol. 14, No. 5, p. 1920
Main authors: Shakhovska, Nataliya; Zherebetskyi, Oleh; Lupenko, Serhii
Medium: Journal Article
Language: English
Publication details: Basel: MDPI AG, 1 March 2024
ISSN: 2076-3417
Abstract The paper aims to develop an information system for human emotion recognition in streaming data obtained from a PC or smartphone camera, using different methods of merging modalities (image, sound and text). The objects of research are facial expressions, the emotional coloring of the tone of a conversation and the text conveyed by a person. The paper proposes different neural network structures for emotion recognition based on unimodal streams, together with models for merging the multimodal data. The analysis determined that the best classification accuracy is obtained by systems that fuse the data after processing each channel separately and extracting individual features. The final evaluation of the model on data from a camera and microphone, or from a screen recording or broadcast received in “live” mode, made it clear that the quality of the results depends strongly on the quality of data preparation and labeling: the data on which the neural network is trained must be of high quality. The neural network that combines the data on the penultimate layer achieves a psycho-emotional state recognition accuracy of 0.90. The spatial distribution of emotions was also analyzed for each data modality. Overall, the model with late fusion of multimodal data demonstrated the best recognition accuracy.
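As a rough illustration of the late-fusion scheme the abstract describes (separate unimodal encoders whose features are combined on the penultimate layer before classification), the following Python sketch shows the idea. The encoder architectures, feature dimensions, the seven-class emotion set and the choice of PyTorch are illustrative assumptions only; the record does not specify the authors' implementation.

# Minimal sketch of late (feature-level) fusion: each modality is processed
# by its own encoder, and the resulting feature vectors are concatenated on
# the penultimate layer before the final classifier.
# Encoder sizes and the 7-class emotion set are assumptions for illustration.
import torch
import torch.nn as nn

class LateFusionEmotionNet(nn.Module):
    def __init__(self, img_dim=512, audio_dim=128, text_dim=256, n_classes=7):
        super().__init__()
        # Unimodal encoders (placeholders for, e.g., a CNN over face crops,
        # a spectrogram network and a text embedding model).
        self.img_enc = nn.Sequential(nn.Linear(img_dim, 128), nn.ReLU())
        self.audio_enc = nn.Sequential(nn.Linear(audio_dim, 64), nn.ReLU())
        self.text_enc = nn.Sequential(nn.Linear(text_dim, 64), nn.ReLU())
        # Fusion happens here: concatenated features feed the classifier head.
        self.classifier = nn.Linear(128 + 64 + 64, n_classes)

    def forward(self, img_feat, audio_feat, text_feat):
        fused = torch.cat(
            [self.img_enc(img_feat),
             self.audio_enc(audio_feat),
             self.text_enc(text_feat)],
            dim=-1,
        )
        return self.classifier(fused)

# Example: one batch of pre-extracted unimodal feature vectors.
model = LateFusionEmotionNet()
logits = model(torch.randn(4, 512), torch.randn(4, 128), torch.randn(4, 256))
print(logits.shape)  # torch.Size([4, 7])

In this late-fusion setup each modality is encoded independently, which matches the paper's finding that fusing the channels only after per-channel feature extraction yields the best classification accuracy.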
Audience Academic
Author Shakhovska, Nataliya
Zherebetskyi, Oleh
Lupenko, Serhii
ContentType Journal Article
Copyright COPYRIGHT 2024 MDPI AG
2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
DOI 10.3390/app14051920
Discipline Engineering
Sciences (General)
EISSN 2076-3417
ISICitedReferencesCount 2
ISSN 2076-3417
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 5
Language English
ORCID Shakhovska, Nataliya: 0000-0002-6875-8534
Lupenko, Serhii: 0000-0002-6559-0721
OpenAccessLink https://doaj.org/article/d3262812436046969929afa050983724
PublicationDate 2024-03-01
PublicationPlace Basel
PublicationTitle Applied sciences
PublicationYear 2024
Publisher MDPI AG
StartPage 1920
SubjectTerms Accuracy
Algorithms
Artificial intelligence
convolution neural network
Datasets
emotional state
Emotions
Information systems
late fusion
multi-modal emotion recognition
multimodal data
Neural networks
Volume 14