Robust Federated Learning With Noisy Labeled Data Through Loss Function Correction
Saved in:

| Published in: | IEEE Transactions on Network Science and Engineering, Volume 10, Issue 3, pp. 1-11 |
|---|---|
| Main authors: | Chen, Li; Ang, Fan; Chen, Yunfei |
| Format: | Journal Article |
| Language: | English |
| Publication details: | Piscataway: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.05.2023 |
| Subject: | |
| ISSN: | 2327-4697, 2334-329X |
| Online access: | Get full text |
| Abstract | Federated learning (FL) is a communication-efficient machine learning paradigm for leveraging distributed data at the network edge. Nevertheless, FL usually fails to train a high-quality model when the edge nodes collect noisy labeled data. To tackle this challenge, this paper focuses on developing an innovative robust FL. We consider two kinds of networks with different data distributions. Firstly, we design a reweighted FL for a full-data network, where every edge node holds both a large noisy labeled dataset and a small clean dataset. The key idea is that edge devices learn to assign local weights to the loss functions on the noisy labeled dataset, and cooperate with the central server to update global weights. Secondly, we consider a part-data network, where some edge nodes lack a clean dataset and cannot compute the weights locally. Broadcasting of the global weights is added to help those edge nodes reweight their noisy loss functions. Both designs have a convergence rate of $\mathcal{O}(1/T^{2})$. Simulation results illustrate that both proposed training processes improve prediction accuracy thanks to the proper weight assignment of the noisy loss functions. |
|---|---|
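The reweighting idea described in the abstract (use a small clean dataset to down-weight likely mislabeled samples in the noisy loss, then average models at the server) can be sketched as below. This is a minimal illustrative sketch, not the paper's algorithm: it uses a toy logistic-regression model, and the specific weighting rule (positive part of the inner product between a sample's gradient and the mean clean-set gradient, in the spirit of learning-to-reweight methods) as well as all function names are assumptions made for illustration.

```python
import numpy as np

def per_sample_grads(w, X, y):
    """Per-sample gradients of the logistic loss with respect to w."""
    p = 1.0 / (1.0 + np.exp(-X @ w))          # predicted probabilities
    return (p - y)[:, None] * X               # shape (n_samples, n_features)

def reweight(w, X_noisy, y_noisy, X_clean, y_clean):
    """Weight each noisy sample by the positive part of the inner product
    between its gradient and the mean clean-set gradient, normalized to
    sum to one.  Samples whose gradients oppose the clean-set gradient
    (likely mislabeled) receive zero weight.  Illustrative rule, not the
    paper's update."""
    g_noisy = per_sample_grads(w, X_noisy, y_noisy)
    g_clean = per_sample_grads(w, X_clean, y_clean).mean(axis=0)
    raw = np.maximum(0.0, g_noisy @ g_clean)
    total = raw.sum()
    return raw / total if total > 0 else np.full(len(raw), 1.0 / len(raw))

def local_step(w, X_noisy, y_noisy, X_clean, y_clean, lr=0.5):
    """One reweighted gradient step on a node's noisy data."""
    v = reweight(w, X_noisy, y_noisy, X_clean, y_clean)
    grad = (v[:, None] * per_sample_grads(w, X_noisy, y_noisy)).sum(axis=0)
    return w - lr * grad

def fedavg_round(w_global, nodes, lr=0.5):
    """One FL round: every node takes a reweighted local step from the
    current global model, and the server averages the resulting models.
    Each node is a tuple (X_noisy, y_noisy, X_clean, y_clean)."""
    return np.mean([local_step(w_global, *n, lr=lr) for n in nodes], axis=0)
```

In a part-data network, a node without a clean dataset could instead apply weights broadcast by the server; here every node carries its own clean set for simplicity.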
| Author | Chen, Li; Ang, Fan; Chen, Yunfei |
| Author_xml | 1. Chen, Li (ORCID 0000-0002-1754-0607), Department of Electronic Engineering and Information Science, University of Science and Technology of China, China; 2. Ang, Fan (ORCID 0000-0003-2569-745X), Department of Electronic Engineering and Information Science, University of Science and Technology of China, China; 3. Chen, Yunfei (ORCID 0000-0001-8083-1805), School of Engineering, University of Warwick, Coventry, U.K. |
| CODEN | ITNSD5 |
| ContentType | Journal Article |
| Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2023 |
| DOI | 10.1109/TNSE.2022.3227287 |
| Discipline | Engineering |
| EISSN | 2334-329X |
| EndPage | 11 |
| Genre | orig-research |
| GrantInformation_xml | Fundamental Research Funds for the Central Universities (WK3500000007); National Key Research and Development Program of China (2018YFA0701603); National Natural Science Foundation of China (62071445) |
| ISICitedReferencesCount | 9 |
| ISSN | 2327-4697 |
| IsPeerReviewed | true |
| IsScholarly | true |
| Issue | 3 |
| Language | English |
| License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
| ORCID | 0000-0001-8083-1805 0000-0003-2569-745X 0000-0002-1754-0607 0000-0002-3550-0625 |
| PageCount | 11 |
| PublicationDate | 2023-05-01 |
| PublicationPlace | Piscataway |
| PublicationTitle | IEEE transactions on network science and engineering |
| PublicationTitleAbbrev | TNSE |
| PublicationYear | 2023 |
| Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
| StartPage | 1 |
| SubjectTerms | Convergence; Data models; Datasets; Distributed networks; Federated learning; label noise; Loss measurement; Machine learning; Nodes; Noise measurement; non-convex optimization; parallel and distributed algorithms; robust design; Robustness; Servers; Training |
| Title | Robust Federated Learning With Noisy Labeled Data Through Loss Function Correction |
| URI | https://ieeexplore.ieee.org/document/9973351 https://www.proquest.com/docview/2806216998 |
| Volume | 10 |
| openUrl | Chen, Li; Ang, Fan; Chen, Yunfei; Wang, Weidong, "Robust Federated Learning With Noisy Labeled Data Through Loss Function Correction," IEEE Transactions on Network Science and Engineering, vol. 10, no. 3, p. 1501, 2023-05-01, The Institute of Electrical and Electronics Engineers, Inc. (IEEE), EISSN 2334-329X, doi:10.1109/TNSE.2022.3227287 |