Adaptive Learning for Dynamic Features and Noisy Labels

Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 47, No. 2, pp. 1219–1237
Main authors: Gu, Shilin; Xu, Chao; Hu, Dewen; Hou, Chenping
Medium: Journal Article
Language: English
Publication details: United States, IEEE, 01.02.2025
ISSN: 0162-8828, 1939-3539; EISSN: 2160-9292, 1939-3539
Abstract Applying current machine learning algorithms in complex and open environments remains challenging, especially when different changing elements are coupled and the training data is scarce. For example, in the activity recognition task, the motion sensors may change position or fall off due to the intensity of the activity, leading to changes in feature space and finally resulting in label noise. Learning from such a problem where the dynamic features are coupled with noisy labels is crucial but rarely studied, particularly when the noisy samples in new feature space are limited. In this paper, we tackle the above problem by proposing a novel two-stage algorithm, called Adaptive Learning for Dynamic features and Noisy labels (ALDN). Specifically, optimal transport is first modified to map the previously learned heterogeneous model to the prior model of the current stage. Then, to fully reuse the mapped prior model, we add a simple yet efficient regularizer as the consistency constraint to assist both the estimation of the noise transition matrix and the model training in the current stage. Finally, two implementations with direct (ALDN-D) and indirect (ALDN-ID) constraints are illustrated for better investigation. More importantly, we provide theoretical guarantees for risk minimization of ALDN-D and ALDN-ID. Extensive experiments validate the effectiveness of the proposed algorithms.
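The abstract names two ingredients that combine naturally in a single objective: a noise transition matrix used to correct the training loss, and a consistency regularizer that pulls the current model toward the (mapped) prior model. The sketch below is a generic illustration of those two concepts, not the paper's actual ALDN objective; all function names and the weighting parameter `lam` are hypothetical.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax over a 2-D logit array."""
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def forward_corrected_loss(logits, noisy_labels, T):
    """Cross-entropy on noise-adjusted posteriors: p_noisy = T^T p_clean,
    where T[i, j] = P(observed label j | true label i)."""
    p_clean = softmax(logits)
    p_noisy = p_clean @ T  # row-wise application of T^T
    n = len(noisy_labels)
    return -np.mean(np.log(p_noisy[np.arange(n), noisy_labels] + 1e-12))

def consistency_penalty(logits, prior_probs):
    """Squared-distance regularizer keeping current predictions close to
    the prior model's predictions (the 'consistency constraint' idea)."""
    return np.mean((softmax(logits) - prior_probs) ** 2)

def noisy_label_objective(logits, noisy_labels, T, prior_probs, lam=0.1):
    """Illustrative composite: corrected loss + lam * consistency term."""
    return (forward_corrected_loss(logits, noisy_labels, T)
            + lam * consistency_penalty(logits, prior_probs))
```

With `T` set to the identity the corrected loss reduces to ordinary cross-entropy; a non-trivial `T` spreads probability mass onto the classes the noise process could have produced, which is the standard forward-correction idea the transition-matrix estimation supports.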
Authors:
– Gu, Shilin (ORCID 0000-0003-1681-5856; gslnudt@outlook.com; College of Science, National University of Defense Technology, Changsha, China)
– Xu, Chao (ORCID 0009-0009-4322-0699; xcnudt@hotmail.com; College of Science, National University of Defense Technology, Changsha, China)
– Hu, Dewen (ORCID 0000-0001-7357-0053; dwhu@nudt.edu.cn; College of Intelligence Science and Technology, National University of Defense Technology, Changsha, China)
– Hou, Chenping (ORCID 0000-0002-9335-0469; houchenping@nudt.edu.cn; College of Science, National University of Defense Technology, Changsha, China)
CODEN ITPIDJ
DOI 10.1109/TPAMI.2024.3489217
Discipline Engineering
Computer Science
GrantInformation_xml – fundername: Key NSF of China
  grantid: 62136005; 62036013
– fundername: National Science Fund for Distinguished Young Scholars
  grantid: 62425607
  funderid: 10.13039/501100014219
IsPeerReviewed true
IsScholarly true
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
PMID 39480720
PageCount 19
PublicationTitle IEEE transactions on pattern analysis and machine intelligence
PublicationTitleAbbrev TPAMI
PublicationTitleAlternate IEEE Trans Pattern Anal Mach Intell
PublicationYear 2025
Publisher IEEE
SubjectTerms Adaptation models
Adaptive learning
Data models
Dynamic feature space
Feature extraction
heterogeneous model reuse
Heuristic algorithms
incremental learning
Noise
Noise measurement
noisy labels
Risk minimization
Streams
Training
URI https://ieeexplore.ieee.org/document/10740558
https://www.ncbi.nlm.nih.gov/pubmed/39480720
https://www.proquest.com/docview/3123072319
Volume 47