Multi-View Large Population Gait Database With Human Meshes and Its Performance Evaluation


Detailed bibliography
Published in: IEEE Transactions on Biometrics, Behavior, and Identity Science, Volume 4, Issue 2, pp. 234-248
Main authors: Li, Xiang; Makihara, Yasushi; Xu, Chi; Yagi, Yasushi
Medium: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.04.2022
ISSN: 2637-6407
Abstract Existing model-based gait databases provide the 2D poses (i.e., joint locations) extracted by general pose estimators as the human model. However, these 2D poses suffer from information loss and are of relatively low quality. In this paper, we consider a more informative 3D human mesh model with parametric pose and shape features, and propose a multi-view training framework for accurate mesh estimation. Unlike existing methods, which estimate a mesh from a single view and suffer from the ill-posed estimation problem in 3D space, the proposed framework takes asynchronous multi-view gait sequences as input and uses both multi-view and single-view streams to learn consistent and accurate mesh models for both multi-view and single-view sequences. After applying the proposed framework to the existing OU-MVLP database, we establish a large-scale gait database with human meshes (i.e., OUMVLP-Mesh), containing over 10,000 subjects and up to 14 view angles. Experimental results show that the proposed framework estimates human mesh models more accurately than similar methods, providing models of sufficient quality to improve the recognition performance of a baseline model-based gait recognition approach.
Author Makihara, Yasushi
Yagi, Yasushi
Li, Xiang
Xu, Chi
Author_xml – sequence: 1
  givenname: Xiang
  orcidid: 0000-0002-8044-7050
  surname: Li
  fullname: Li, Xiang
  email: li@am.sanken.osaka-u.ac.jp
  organization: Department of Intelligent Media, SANKEN, Osaka University, Osaka, Japan
– sequence: 2
  givenname: Yasushi
  surname: Makihara
  fullname: Makihara, Yasushi
  email: makihara@am.sanken.osaka-u.ac.jp
  organization: Department of Intelligent Media, SANKEN, Osaka University, Osaka, Japan
– sequence: 3
  givenname: Chi
  orcidid: 0000-0001-6036-5763
  surname: Xu
  fullname: Xu, Chi
  email: xu@am.sanken.osaka-u.ac.jp
  organization: Department of Intelligent Media, SANKEN, Osaka University, Osaka, Japan
– sequence: 4
  givenname: Yasushi
  orcidid: 0000-0002-3546-8071
  surname: Yagi
  fullname: Yagi, Yasushi
  email: yagi@am.sanken.osaka-u.ac.jp
  organization: Department of Intelligent Media, SANKEN, Osaka University, Osaka, Japan
CitedBy_id crossref_primary_10_3390_s25113471
crossref_primary_10_1007_s00521_025_11505_x
crossref_primary_10_1109_TPAMI_2025_3577594
crossref_primary_10_1109_TPAMI_2023_3312419
crossref_primary_10_1007_s11227_024_06172_z
crossref_primary_10_1109_TIFS_2023_3236181
crossref_primary_10_1109_ACCESS_2025_3545787
crossref_primary_10_1109_TIFS_2024_3428371
crossref_primary_10_1007_s11227_023_05143_0
crossref_primary_10_1109_ACCESS_2025_3542837
crossref_primary_10_1109_ACCESS_2025_3570280
crossref_primary_10_1109_TMM_2023_3312931
crossref_primary_10_1109_TCSVT_2025_3545210
crossref_primary_10_1109_TPAMI_2025_3546482
crossref_primary_10_1007_s11227_024_06089_7
crossref_primary_10_1109_TIP_2024_3360870
crossref_primary_10_1109_TNNLS_2025_3526815
crossref_primary_10_3390_jimaging10040088
Cites_doi 10.2197/ipsjtcva.5.163
10.1109/ICIP.2002.1038998
10.1117/12.2018145
10.1109/ICCVW54120.2021.00456
10.1109/CVPR.2018.00411
10.1007/978-3-030-58529-7_23
10.1109/AFGR.2004.1301502
10.1007/978-3-319-13323-2_3
10.1109/ICCV48922.2021.01465
10.1007/978-3-319-69923-3_51
10.1007/978-3-030-58545-7_22
10.1109/TBIOM.2020.3008862
10.1145/2661229.2661273
10.1016/j.patcog.2019.04.023
10.1109/TPAMI.2014.2366766
10.1145/2816795.2818013
10.1016/j.patcog.2003.09.012
10.1109/CVPR.2012.6247844
10.1109/3DV.2017.00064
10.1007/11744078_12
10.1109/CVPR.2018.00744
10.1049/iet-bmt.2013.0090
10.1016/j.patcog.2019.107069
10.1109/ICB.2012.6199832
10.1109/TPAMI.2019.2929257
10.1109/ICPR.2010.849
10.1109/CVPR.2018.00055
10.1109/CVPR42600.2020.00530
10.1109/TPAMI.2013.248
10.1109/CVPR.2019.00484
10.1016/j.patcog.2018.10.019
10.1109/TPAMI.2006.38
10.1109/TCSVT.2017.2760835
10.1109/ICCV.2019.00445
10.1109/TIP.2014.2371335
10.1109/ICCV.2019.00554
10.1007/978-3-030-69535-4_1
10.1007/978-3-319-46454-1_34
10.1111/j.1556-4029.2011.01793.x
10.1109/ICPR.2006.67
10.1007/978-3-319-46493-0_38
10.1109/ICCV.2017.256
10.1109/CVPR.2017.718
10.1109/TIFS.2019.2912577
10.1186/s41074-018-0039-6
10.1109/CVPR.2019.00483
10.1109/CVPR.2010.5540113
10.1109/AFGR.2004.1301521
10.1609/aaai.v33i01.33018126
10.1109/CVPR42600.2020.01423
10.1016/S1077-3142(03)00008-0
10.1109/TIFS.2012.2204253
10.1109/TPAMI.2016.2545669
10.3115/v1/D14-1179
10.1109/CVPR.2001.990506
10.1109/ICB.2016.7550060
10.1109/TIFS.2018.2844819
10.1109/TPAMI.2005.39
10.1109/EST.2010.19
10.1109/CVPR.2014.323
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2022
Copyright_xml – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2022
DOI 10.1109/TBIOM.2022.3174559
DatabaseName IEEE Xplore (IEEE)
IEEE Xplore Open Access (Activated by CARLI)
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE Electronic Library (IEL)
CrossRef
Electronics & Communications Abstracts
Technology Research Database
Advanced Technologies Database with Aerospace
DatabaseTitle CrossRef
Technology Research Database
Advanced Technologies Database with Aerospace
Electronics & Communications Abstracts
DatabaseTitleList Technology Research Database

Discipline Biology
EISSN 2637-6407
EndPage 248
ExternalDocumentID 10_1109_TBIOM_2022_3174559
9773349
Genre orig-research
GrantInformation_xml – fundername: JSPS KAKENHI
  grantid: JP19H05692; JP20H00607
ISICitedReferencesCount 31
ISSN 2637-6407
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 2
Language English
License https://creativecommons.org/licenses/by-nc-nd/4.0
LinkModel DirectLink
ORCID 0000-0001-6036-5763
0000-0002-8044-7050
0000-0002-3546-8071
OpenAccessLink https://ieeexplore.ieee.org/document/9773349
PQID 2679398168
PQPubID 4437219
PageCount 15
ParticipantIDs crossref_citationtrail_10_1109_TBIOM_2022_3174559
proquest_journals_2679398168
crossref_primary_10_1109_TBIOM_2022_3174559
ieee_primary_9773349
PublicationCentury 2000
PublicationDate 2022-04-01
PublicationDateYYYYMMDD 2022-04-01
PublicationDate_xml – month: 04
  year: 2022
  text: 2022-04-01
  day: 01
PublicationDecade 2020
PublicationPlace Piscataway
PublicationPlace_xml – name: Piscataway
PublicationTitle IEEE transactions on biometrics, behavior, and identity science
PublicationTitleAbbrev TBIOM
PublicationYear 2022
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Publisher_xml – name: IEEE
– name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
References ref13
ref57
ref12
ref56
ref59
ref14
ref58
ref53
ref52
ref11
ref55
ref10
ref17
ref16
ref19
ref18
ref51
ref50
ref46
ref45
ref48
ref47
ref42
ref41
ref44
ref43
ref49
Kingma (ref61) 2014
ref8
ref7
ref9
ref4
ref3
ref6
ref5
ref40
ref35
ref37
ref36
ref31
ref30
ref33
ref32
ref2
ref1
ref39
ref38
Makihara (ref15)
ref24
ref23
ref26
ref25
ref20
ref64
ref63
ref22
ref21
ref65
Shin (ref54) 2020
Gross (ref34) 2001
ref28
ref27
ref29
ref60
References_xml – ident: ref2
  doi: 10.2197/ipsjtcva.5.163
– ident: ref37
  doi: 10.1109/ICIP.2002.1038998
– ident: ref39
  doi: 10.1117/12.2018145
– ident: ref45
  doi: 10.1109/ICCVW54120.2021.00456
– ident: ref60
  doi: 10.1109/CVPR.2018.00411
– ident: ref58
  doi: 10.1007/978-3-030-58529-7_23
– ident: ref21
  doi: 10.1109/AFGR.2004.1301502
– ident: ref38
  doi: 10.1007/978-3-319-13323-2_3
– ident: ref65
  doi: 10.1109/ICCV48922.2021.01465
– ident: ref30
  doi: 10.1007/978-3-319-69923-3_51
– ident: ref64
  doi: 10.1007/978-3-030-58545-7_22
– ident: ref43
  doi: 10.1109/TBIOM.2020.3008862
– ident: ref57
  doi: 10.1145/2661229.2661273
– ident: ref33
  doi: 10.1016/j.patcog.2019.04.023
– ident: ref5
  doi: 10.1109/TPAMI.2014.2366766
– ident: ref46
  doi: 10.1145/2816795.2818013
– ident: ref22
  doi: 10.1016/j.patcog.2003.09.012
– ident: ref18
  doi: 10.1109/CVPR.2012.6247844
– ident: ref56
  doi: 10.1109/3DV.2017.00064
– ident: ref14
  doi: 10.1007/11744078_12
– ident: ref49
  doi: 10.1109/CVPR.2018.00744
– ident: ref3
  doi: 10.1049/iet-bmt.2013.0090
– ident: ref31
  doi: 10.1016/j.patcog.2019.107069
– ident: ref26
  doi: 10.1109/ICB.2012.6199832
– ident: ref27
  doi: 10.1109/TPAMI.2019.2929257
– ident: ref25
  doi: 10.1109/ICPR.2010.849
– ident: ref48
  doi: 10.1109/CVPR.2018.00055
– ident: ref50
  doi: 10.1109/CVPR42600.2020.00530
– ident: ref55
  doi: 10.1109/TPAMI.2013.248
– start-page: 717
  volume-title: Proc. CVPR
  ident: ref15
  article-title: Silhouette transformation based on walking speed for gait identification
– ident: ref41
  doi: 10.1109/CVPR.2019.00484
– ident: ref19
  doi: 10.1016/j.patcog.2018.10.019
– ident: ref4
  doi: 10.1109/TPAMI.2006.38
– year: 2020
  ident: ref54
  article-title: Multi-view human pose and shape estimation using learnable volumetric aggregation
  publication-title: arXiv:2011.13427
– year: 2014
  ident: ref61
  article-title: Adam: A method for stochastic optimization
  publication-title: arXiv:1412.6980
– ident: ref9
  doi: 10.1109/TCSVT.2017.2760835
– ident: ref53
  doi: 10.1109/ICCV.2019.00445
– ident: ref16
  doi: 10.1109/TIP.2014.2371335
– ident: ref52
  doi: 10.1109/ICCV.2019.00554
– ident: ref44
  doi: 10.1007/978-3-030-69535-4_1
– ident: ref47
  doi: 10.1007/978-3-319-46454-1_34
– ident: ref1
  doi: 10.1111/j.1556-4029.2011.01793.x
– ident: ref32
  doi: 10.1109/ICPR.2006.67
– ident: ref59
  doi: 10.1007/978-3-319-46493-0_38
– ident: ref28
  doi: 10.1109/ICCV.2017.256
– ident: ref6
  doi: 10.1109/CVPR.2017.718
– ident: ref12
  doi: 10.1109/TIFS.2019.2912577
– ident: ref29
  doi: 10.1186/s41074-018-0039-6
– ident: ref10
  doi: 10.1109/CVPR.2019.00483
– ident: ref13
  doi: 10.1109/CVPR.2010.5540113
– ident: ref35
  doi: 10.1109/AFGR.2004.1301521
– ident: ref11
  doi: 10.1609/aaai.v33i01.33018126
– ident: ref63
  doi: 10.1109/CVPR42600.2020.01423
– ident: ref24
  doi: 10.1016/S1077-3142(03)00008-0
– ident: ref42
  doi: 10.1109/TIFS.2012.2204253
– ident: ref8
  doi: 10.1109/TPAMI.2016.2545669
– ident: ref51
  doi: 10.3115/v1/D14-1179
– ident: ref23
  doi: 10.1109/CVPR.2001.990506
– ident: ref7
  doi: 10.1109/ICB.2016.7550060
– ident: ref20
  doi: 10.1109/TIFS.2018.2844819
– ident: ref36
  doi: 10.1109/TPAMI.2005.39
– ident: ref40
  doi: 10.1109/EST.2010.19
– ident: ref17
  doi: 10.1109/CVPR.2014.323
– year: 2001
  ident: ref34
  article-title: The CMU motion of body (MoBo) database
SourceID proquest
crossref
ieee
SourceType Aggregation Database
Enrichment Source
Index Database
Publisher
StartPage 234
SubjectTerms Asynchronous multi-view sequences
Finite element method
gait database
Gait recognition
Human performance
Performance evaluation
Shape
Skeleton
Solid modeling
Three dimensional models
Three-dimensional displays
three-dimensional human pose/shape estimation
Videos
Title Multi-View Large Population Gait Database With Human Meshes and Its Performance Evaluation
URI https://ieeexplore.ieee.org/document/9773349
https://www.proquest.com/docview/2679398168
Volume 4