Attribute-Aware Pedestrian Detection in a Crowd

Detailed bibliography
Published in: IEEE Transactions on Multimedia, Volume 23, pp. 3085-3097
Main authors: Zhang, Jialiang; Lin, Lixiang; Zhu, Jianke; Li, Yang; Chen, Yun-chen; Hu, Yao; Hoi, Steven C. H.
Format: Journal Article
Language: English
Published: Piscataway: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 2021
ISSN: 1520-9210, 1941-0077
Online access: Get full text
Abstract Pedestrian detection is an initial step in outdoor scene analysis and plays an essential role in many real-world applications. Although it has enjoyed the merits of deep learning frameworks developed for generic object detection, pedestrian detection remains a very challenging task due to heavy occlusions and highly crowded scenes. Generally, conventional detectors are unable to differentiate individuals from each other effectively in such dense environments. To tackle this critical problem, we propose an attribute-aware pedestrian detector that explicitly models people's semantic attributes in a high-level feature detection fashion. Besides the typical semantic features, namely center position, target scale, and offset, we introduce a pedestrian-oriented attribute feature to encode the high-level semantic differences among the crowd. Moreover, a novel attribute-feature-based Non-Maximum Suppression (NMS) is proposed to distinguish individuals within a highly overlapped group by adaptively rejecting false-positive results in very crowded settings. Furthermore, an enhanced ground-truth target is designed to alleviate the difficulties caused by the attribute configuration and to ease the class-imbalance issue during training. Finally, we evaluate the proposed attribute-aware pedestrian detector on three benchmark datasets, CityPersons, CrowdHuman, and EuroCity Persons, and achieve state-of-the-art results.
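As a reading aid, the sketch below illustrates one way an attribute-feature-based NMS of the kind described in the abstract could be wired up. It is a minimal illustration under stated assumptions, not the authors' published algorithm: it assumes each detection carries a bounding box, a confidence score, and an L2-normalised attribute embedding, and the function names, the cosine-similarity test, and the thresholds iou_thr and attr_sim_thr are hypothetical choices made here for clarity.

```python
import numpy as np

def iou(box, boxes):
    """IoU between one box and an array of boxes in (x1, y1, x2, y2) format."""
    x1 = np.maximum(box[0], boxes[:, 0])
    y1 = np.maximum(box[1], boxes[:, 1])
    x2 = np.minimum(box[2], boxes[:, 2])
    y2 = np.minimum(box[3], boxes[:, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area_a = (box[2] - box[0]) * (box[3] - box[1])
    area_b = (boxes[:, 2] - boxes[:, 0]) * (boxes[:, 3] - boxes[:, 1])
    return inter / (area_a + area_b - inter + 1e-9)

def attribute_aware_nms(boxes, scores, attrs, iou_thr=0.5, attr_sim_thr=0.8):
    """Greedy NMS variant (illustrative): suppress a candidate only if it overlaps
    the kept box heavily AND its attribute embedding is similar, i.e. it plausibly
    belongs to the same person.

    boxes: (N, 4) array, scores: (N,), attrs: (N, D) with L2-normalised rows.
    """
    order = np.argsort(scores)[::-1]  # process detections by descending score
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(i)
        rest = order[1:]
        if rest.size == 0:
            break
        overlaps = iou(boxes[i], boxes[rest])
        sims = attrs[rest] @ attrs[i]  # cosine similarity, since embeddings are unit-norm
        # Reject only boxes that both overlap heavily and look like the same person.
        suppress = (overlaps > iou_thr) & (sims > attr_sim_thr)
        order = rest[~suppress]
    return keep
```

Under this hypothetical rule, two pedestrians whose boxes overlap at IoU 0.7 would both survive as long as their attribute embeddings disagree (cosine similarity below attr_sim_thr), whereas a plain IoU-threshold NMS would discard the lower-scoring one.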
Authors
1. Zhang, Jialiang (ORCID 0000-0001-5085-3771; zjialiang@zju.edu.cn), College of Computer Science, Zhejiang University, Hangzhou, China
2. Lin, Lixiang (ORCID 0000-0001-8319-2009; lxlin@zju.edu.cn), College of Computer Science, Zhejiang University, Hangzhou, China
3. Zhu, Jianke (ORCID 0000-0003-1831-0106; jkzhu@zju.edu.cn), College of Computer Science, Zhejiang University, Hangzhou 310027, China, and also with the Alibaba-Zhejiang University Joint Research Institute of Frontier Technologies, Hangzhou, China
4. Li, Yang (ORCID 0000-0001-9427-7665; liyang89@zju.edu.cn), College of Computer Science, Zhejiang University, Hangzhou, China
5. Chen, Yun-chen (jeanchen@smu.edu.sg), School of Information Systems, Singapore Management University, Singapore
6. Hu, Yao (yaoohu@alibaba-inc.com), Alibaba Youku Cognitive and Intelligent Lab, Beijing, China
7. Hoi, Steven C. H. (ORCID 0000-0002-4584-3453; chhoi@smu.edu.sg), School of Information Systems, Singapore Management University, Singapore
CODEN ITMUF8
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2021
DOI 10.1109/TMM.2020.3020691
Discipline Engineering
Computer Science
EISSN 1941-0077
EndPage 3097
ExternalDocumentID 10_1109_TMM_2020_3020691
9184102
Genre orig-research
GrantInformation National Natural Science Foundation of China, Grant 61831015 (funder ID 10.13039/501100001809)
ISICitedReferencesCount 50
ISSN 1520-9210
IsPeerReviewed true
IsScholarly true
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
PageCount 13
PublicationPlace Piscataway
PublicationTitle IEEE transactions on multimedia
PublicationTitleAbbrev TMM
PublicationYear 2021
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
StartPage 3085
SubjectTerms Attribute-aware
Detectors
Feature extraction
Non-maximum suppression (NMS)
Object detection
Pedestrian detection
Proposals
Scene analysis
Semantics
Task analysis
Training
URI https://ieeexplore.ieee.org/document/9184102
https://www.proquest.com/docview/2575980293
Volume 23