Human3.6M: Large Scale Datasets and Predictive Methods for 3D Human Sensing in Natural Environments

Bibliographic Details
Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 36, No. 7, pp. 1325-1339
Authors: Ionescu, Catalin; Papava, Dragos; Olaru, Vlad; Sminchisescu, Cristian
Format: Journal Article
Language: English
Published: Los Alamitos, CA: IEEE Computer Society, 1 July 2014
ISSN: 0162-8828 (print); 1939-3539, 2160-9292 (electronic)
Online access: Full text
Abstract: We introduce a new dataset, Human3.6M, of 3.6 million accurate 3D human poses, acquired by recording the performance of 5 female and 6 male subjects under 4 different viewpoints, for training realistic human sensing systems and for evaluating the next generation of human pose estimation models and algorithms. Besides increasing the size of the datasets in the current state of the art by several orders of magnitude, we also aim to complement such datasets with a diverse set of motions and poses encountered as part of typical human activities (taking photos, talking on the phone, posing, greeting, eating, etc.), with additional synchronized image, human motion capture, and time-of-flight (depth) data, and with accurate 3D body scans of all the subject actors involved. We also provide controlled mixed-reality evaluation scenarios where 3D human models are animated using motion capture and inserted, using correct 3D geometry, into complex real environments, viewed with moving cameras, and under occlusion. Finally, we provide a set of large-scale statistical models and detailed evaluation baselines for the dataset, illustrating its diversity and the scope for improvement by future work in the research community. Our experiments show that our best large-scale model can leverage our full training set to obtain a 20% improvement in performance compared to a training set of the scale of the largest existing public dataset for this problem. Yet the potential for improvement by leveraging higher-capacity, more complex models with our large dataset is substantially vaster and should stimulate future research. The dataset, together with code for the associated large-scale learning models, features, visualization tools, as well as the evaluation server, is available online at http://vision.imar.ro/human3.6m.
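For a concrete sense of how pose estimates are scored against the benchmark's motion-capture ground truth: the standard error measure reported on Human3.6M is the mean per joint position error (MPJPE), the average Euclidean distance between estimated and ground-truth 3D joint positions. Below is a minimal Python sketch of that computation; the array shapes, joint count, and function name are illustrative assumptions, not code from the released toolkit.

import numpy as np

def mpjpe(pred, gt):
    """Mean per joint position error, in the units of the inputs
    (Human3.6M ground truth is expressed in millimetres).

    pred, gt: float arrays of shape (n_frames, n_joints, 3) holding
    estimated and ground-truth 3D joint positions.
    """
    assert pred.shape == gt.shape
    # Per-joint Euclidean distances, shape (n_frames, n_joints),
    # averaged over joints and frames.
    return np.linalg.norm(pred - gt, axis=-1).mean()

# Toy check: every joint displaced by a 3-4-5 right triangle -> error 5.0.
gt = np.zeros((2, 17, 3))              # 2 frames, 17 joints
pred = gt + np.array([3.0, 0.0, 4.0])
print(mpjpe(pred, gt))                 # 5.0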
Authors and affiliations:
– Catalin Ionescu (Institute of Mathematics, Bucharest, Romania; catalin.ionescu@ins.uni-bonn.de)
– Dragos Papava (Institute of Mathematics, Bucharest, Romania; dragos.papava@imar.ro)
– Vlad Olaru (Institute of Mathematics, Bucharest, Romania; vlad.olaru@imar.ro)
– Cristian Sminchisescu (Department of Mathematics, Lund University, Lund, Sweden; cristian.sminchisescu@math.lth.se)
Identifiers:
DOI: 10.1109/TPAMI.2013.248
PMID: 26353306
CODEN: ITPIDJ
Copyright: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), Jul 2014; 2015 INIST-CNRS.
CorporateAuthor Lund University
Faculty of Science
Centre for Mathematical Sciences
Mathematics (Faculty of Engineering)
ELLIIT: the Linköping-Lund initiative on IT and mobile communication
eSSENCE: The e-Science Collaboration
Strategic research areas (SRA)
Profile areas and other strong research environments
DOI 10.1109/TPAMI.2013.248
Keywords large-scale learning
structured prediction
articulated body modeling
optimization
human motion capture data
3D human pose estimation
Fourier kernel approximations
Occlusion
Very large databases
Modeling
Optimization
Posture
Natural environment
Scientific research
Classification
Internal structure
Data structure
Learning algorithm
Computer vision
Statistical analysis
Motion estimation
Viewing angle
Computer animation
Augmented reality
Educational software program
Human activity
Time of flight method
Mixed reality
Data visualization
Body movement
Large scale
Occultation
Structural analysis
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
CC BY 4.0
PMID 26353306
SubjectTerms Algorithms
Applied sciences
Artificial intelligence
Biometry - methods
Cameras
Computer science; control theory; systems
Computer systems and distributed systems. User interface
Databases, Factual
Ecosystem
Estimation
Exact sciences and technology
Humans
Image Enhancement - methods
Image Interpretation, Computer-Assisted - methods
Imaging, Three-Dimensional - methods
Information Storage and Retrieval - methods
Information systems. Data bases
Joints
Learning and adaptive systems
Matematik
Mathematical Sciences
Memory organisation. Data processing
Modeling and recovery of physical attributes
Motion
Natural Sciences
Naturvetenskap
Pattern Recognition, Automated - methods
Pattern recognition. Digital image processing. Computational geometry
Photography - methods
Posture
Reproducibility of Results
Sensitivity and Specificity
Sensors
Software
Solid modeling
Subtraction Technique
Three-dimensional displays
Training
Whole Body Imaging - methods
URI https://ieeexplore.ieee.org/document/6682899
https://www.ncbi.nlm.nih.gov/pubmed/26353306
https://www.proquest.com/docview/1564736274
https://www.proquest.com/docview/1711536910