VI-Net—View-Invariant Quality of Human Movement Assessment

Detailed bibliography
Published in: Sensors (Basel, Switzerland), Volume 20, Issue 18, p. 5258
Main authors: Sardari, Faegheh; Paiement, Adeline; Hannuna, Sion; Mirmehdi, Majid
Format: Journal Article
Language: English
Published: Switzerland: MDPI AG, 15 September 2020
ISSN: 1424-8220
Online access: Full text available
Abstract We propose a view-invariant method towards the assessment of the quality of human movements which does not rely on skeleton data. Our end-to-end convolutional neural network consists of two stages, where at first a view-invariant trajectory descriptor for each body joint is generated from RGB images, and then the collection of trajectories for all joints are processed by an adapted, pre-trained 2D convolutional neural network (CNN) (e.g., VGG-19 or ResNeXt-50) to learn the relationship amongst the different body parts and deliver a score for the movement quality. We release the only publicly-available, multi-view, non-skeleton, non-mocap, rehabilitation movement dataset (QMAR), and provide results for both cross-subject and cross-view scenarios on this dataset. We show that VI-Net achieves average rank correlation of 0.66 on cross-subject and 0.65 on unseen views when trained on only two views. We also evaluate the proposed method on the single-view rehabilitation dataset KIMORE and obtain 0.66 rank correlation against a baseline of 0.62.
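The abstract describes a two-stage pipeline: a view-invariant trajectory descriptor is extracted per body joint from RGB frames, and the stacked joint trajectories are then scored by an adapted, ImageNet-pretrained 2D CNN (e.g., VGG-19). The sketch below illustrates that structure in PyTorch. It is not the authors' released code: the trajectory-descriptor stage is only a placeholder, and the tensor shapes and 3-channel trajectory-map layout are assumptions made here purely for illustration.

```python
# Minimal sketch of the two-stage idea from the abstract (NOT the authors' implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models


class TrajectoryDescriptor(nn.Module):
    """Placeholder for stage 1: maps an RGB clip to a per-joint trajectory map.

    Input:  clip    (B, T, 3, H, W)  RGB frames
    Output: trajmap (B, 3, J, T)     hypothetical 3-channel map, one row per joint
    """

    def __init__(self, num_joints: int = 18):
        super().__init__()
        self.num_joints = num_joints

    def forward(self, clip: torch.Tensor) -> torch.Tensor:
        b, t = clip.shape[0], clip.shape[1]
        # Stand-in output; the real descriptor would be learned from the frames.
        return torch.zeros(b, 3, self.num_joints, t, device=clip.device)


class QualityScorer(nn.Module):
    """Stage 2: an adapted, ImageNet-pretrained 2D CNN regressing a single quality score."""

    def __init__(self):
        super().__init__()
        # Downloads ImageNet weights on first use (torchvision >= 0.13 API).
        self.backbone = models.vgg19(weights=models.VGG19_Weights.IMAGENET1K_V1)
        # Replace the 1000-way classifier head with a one-output regressor.
        self.backbone.classifier[6] = nn.Linear(4096, 1)

    def forward(self, trajmap: torch.Tensor) -> torch.Tensor:
        # Resize the joints-by-time map to the backbone's expected resolution.
        x = F.interpolate(trajmap, size=(224, 224), mode="bilinear", align_corners=False)
        return self.backbone(x).squeeze(-1)


descriptor, scorer = TrajectoryDescriptor(), QualityScorer()
clip = torch.rand(2, 16, 3, 128, 128)   # 2 clips of 16 RGB frames (toy sizes)
score = scorer(descriptor(clip))        # one movement-quality score per clip
print(score.shape)                      # torch.Size([2])
```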
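The reported numbers (0.66 cross-subject and 0.65 cross-view on QMAR, 0.66 on KIMORE) are rank correlations between predicted and reference quality scores. Spearman's rho, the usual choice in action-quality assessment, is assumed here; the values below are made up purely to show the computation.

```python
# Rank correlation between predicted and reference movement-quality scores.
# Spearman's rho is assumed as the measure; the numbers are illustrative only.
from scipy.stats import spearmanr

reference = [0.10, 0.40, 0.35, 0.80, 0.95]   # e.g. clinician-assigned quality scores
predicted = [0.20, 0.30, 0.45, 0.70, 0.90]   # model outputs for the same sequences

rho, p_value = spearmanr(reference, predicted)
print(f"Spearman rank correlation: {rho:.2f} (p = {p_value:.3f})")
```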
Author Sardari, Faegheh (ORCID 0000-0002-9134-0427)
Paiement, Adeline (ORCID 0000-0001-5114-1514)
Hannuna, Sion
Mirmehdi, Majid (ORCID 0000-0002-6478-1403)
AuthorAffiliation 1 Department of Computer Science, University of Bristol, Bristol BS8 1UB, UK; sh1670@bristol.ac.uk (S.H.); m.mirmehdi@bristol.ac.uk (M.M.)
2 Université de Toulon, Aix Marseille Univ, CNRS, LIS, Marseille, France; adeline.paiement@univ-tln.fr
ContentType Journal Article
Copyright Distributed under a Creative Commons Attribution 4.0 International License; 2020 by the authors.
Discipline Engineering
Computer Science
EISSN 1424-8220
Genre Journal Article
GrantInformation Engineering and Physical Sciences Research Council, grant EP/R005273/1
ISSN 1424-8220
Issue 18
Keywords health monitoring
view-invariant convolutional neural network (CNN)
movement analysis
Language English
License Distributed under a Creative Commons Attribution 4.0 International License: http://creativecommons.org/licenses/by/4.0
Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
OpenAccessLink https://doaj.org/article/3dbb8dfcb603485b95552782bcb85de1
PMID 32942561
PublicationDate 2020-09-15
PublicationPlace Switzerland
PublicationTitle Sensors (Basel, Switzerland)
PublicationTitleAlternate Sensors (Basel)
PublicationYear 2020
Publisher MDPI AG
StartPage 5258
SubjectTerms Artificial Intelligence
Computer Science
Computer Vision and Pattern Recognition
health monitoring
Humans
Image Processing
Machine Learning
Movement
movement analysis
Neural and Evolutionary Computing
Neural Networks, Computer
Signal and Image Processing
view-invariant convolutional neural network (CNN)
Title VI-Net—View-Invariant Quality of Human Movement Assessment
URI https://www.ncbi.nlm.nih.gov/pubmed/32942561
https://www.proquest.com/docview/2444385436
https://hal.science/hal-02934456
https://pubmed.ncbi.nlm.nih.gov/PMC7570706
https://doaj.org/article/3dbb8dfcb603485b95552782bcb85de1
Volume 20