Adaptive Hierarchical Similarity Metric Learning with Noisy Labels

Bibliographic Details
Published in: IEEE Transactions on Image Processing, Vol. 32, pp. 1245-1256
Main Authors: Yan, Jiexi; Luo, Lei; Deng, Cheng; Huang, Heng
Format: Journal Article
Language: English
Published: United States: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.01.2023
Subjects: Adaptation models; Cognitive tasks; Consistency; Contrastive Augmentation; Deep Metric Learning; Geometry; Hierarchical Similarity; Hyperbolic Geometry; Labels; Machine learning; Measurement; Noise measurement; Noisy Labels; Performance degradation; Robustness; Similarity; Task analysis; Training
ISSN: 1057-7149; EISSN: 1941-0042
Online Access: https://ieeexplore.ieee.org/document/10041006
Abstract Deep Metric Learning (DML) plays a critical role in various machine learning tasks. However, most existing deep metric learning methods with binary similarity are sensitive to noisy labels, which are widely present in real-world data. Since these noisy labels often cause severe performance degradation, it is crucial to enhance the robustness and generalization ability of DML. In this paper, we propose an Adaptive Hierarchical Similarity Metric Learning method. It considers two kinds of noise-insensitive information, i.e., class-wise divergence and sample-wise consistency. Specifically, class-wise divergence can effectively excavate richer similarity information beyond binary labels by taking advantage of hyperbolic metric learning, while sample-wise consistency can further improve the generalization ability of the model using contrastive augmentation. More importantly, we design an adaptive strategy to integrate the two kinds of information in a unified view. Notably, the new method can be extended to any pair-based metric loss. Extensive experimental results on benchmark datasets demonstrate that our method achieves state-of-the-art performance compared with current deep metric learning approaches.
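For readers who want a concrete picture of the terms used above, the short sketch below illustrates, in PyTorch, what a pair-based metric loss evaluated with a hyperbolic (Poincaré-ball) distance can look like. It is an assumption-laden illustration, not the authors' implementation: the function names, curvature, margin, and pairing scheme are hypothetical choices made here only to unpack the vocabulary of the abstract.

# Illustrative sketch only (not the paper's released code): a pair-based
# contrastive loss evaluated with the Poincare-ball (hyperbolic) distance.
# All names and hyperparameters here (curvature c, margin, scaling) are
# hypothetical choices for demonstration.
import torch

def poincare_distance(x, y, c=1.0, eps=1e-5):
    # Geodesic distance on the Poincare ball of curvature -c.
    x_sq = (x * x).sum(dim=-1)
    y_sq = (y * y).sum(dim=-1)
    diff_sq = ((x - y) ** 2).sum(dim=-1)
    denom = (1 - c * x_sq).clamp_min(eps) * (1 - c * y_sq).clamp_min(eps)
    arg = 1 + 2 * c * diff_sq / denom
    return torch.acosh(arg.clamp_min(1 + eps)) / (c ** 0.5)

def pair_based_hyperbolic_loss(emb_a, emb_b, same_class, margin=1.0):
    # Classic contrastive formulation: pull same-class pairs together,
    # push different-class pairs beyond the margin, using hyperbolic distance.
    d = poincare_distance(emb_a, emb_b)
    pos = same_class.float() * d.pow(2)
    neg = (1.0 - same_class.float()) * torch.relu(margin - d).pow(2)
    return (pos + neg).mean()

if __name__ == "__main__":
    # Toy embeddings, scaled so they lie well inside the unit ball.
    a = 0.05 * torch.randn(8, 64)
    b = 0.05 * torch.randn(8, 64)
    same = torch.randint(0, 2, (8,))
    print(pair_based_hyperbolic_loss(a, b, same).item())

In the paper's terminology, replacing the Euclidean distance with such a hyperbolic one is what allows class-wise similarity to express more than a binary same/different signal; the adaptive weighting of the class-wise and sample-wise terms is specific to the proposed method and is not reproduced in this sketch.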
Author Yan, Jiexi (School of Computer Science and Technology, Xidian University, Xi'an, China)
Luo, Lei (JD Finance American Corporation, Mountain View, CA, USA)
Deng, Cheng (ORCID: 0000-0003-2620-3247; School of Electronic Engineering, Xidian University, Xi'an, China)
Huang, Heng (ORCID: 0000-0002-3483-8333; Department of Electrical and Computer Engineering, University of Pittsburgh, PA, USA)
CODEN IIPRE4
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2023
DOI 10.1109/TIP.2023.3242148
Discipline Applied Sciences
Engineering
EISSN 1941-0042
EndPage 1256
Genre orig-research
Journal Article
ISSN 1057-7149
1941-0042
IsPeerReviewed true
IsScholarly true
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
ORCID 0000-0003-2620-3247
0000-0002-3483-8333
PMID 37022798
PageCount 12
PublicationDate 2023-01-01
PublicationPlace United States
PublicationTitle IEEE transactions on image processing
PublicationTitleAbbrev TIP
PublicationTitleAlternate IEEE Trans Image Process
PublicationYear 2023
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
StartPage 1245
SubjectTerms Adaptation models
Cognitive tasks
Consistency
Contrastive Augmentation
Deep Metric Learning
Geometry
Hierarchical Similarity
Hyperbolic Geometry
Labels
Machine learning
Measurement
Noise measurement
Noisy Labels
Performance degradation
Robustness
Similarity
Task analysis
Training
Title Adaptive Hierarchical Similarity Metric Learning with Noisy Labels
URI https://ieeexplore.ieee.org/document/10041006
https://www.ncbi.nlm.nih.gov/pubmed/37022798
https://www.proquest.com/docview/2776795228
https://www.proquest.com/docview/2797149920
Volume 32