Adaptive Gradient Sparsification for Efficient Federated Learning: An Online Learning Approach
| Published in: | Proceedings of the International Conference on Distributed Computing Systems, pp. 300-310 |
|---|---|
| Main Authors: | Han, Pengchao; Wang, Shiqiang; Leung, Kin K. |
| Format: | Conference Proceeding |
| Language: | English |
| Published: | IEEE, 01.11.2020 |
| ISSN: | 2575-8411 |
| Online Access: | Get full text |
| Abstract | Federated learning (FL) is an emerging technique for training machine learning models using geographically dispersed data collected by local entities. It includes local computation and synchronization steps. To reduce the communication overhead and improve the overall efficiency of FL, gradient sparsification (GS) can be applied, where instead of the full gradient, only a small subset of important elements of the gradient is communicated. Existing work on GS uses a fixed degree of gradient sparsity for i.i.d.-distributed data within a datacenter. In this paper, we consider adaptive degree of sparsity and non-i.i.d. local datasets. We first present a fairness-aware GS method which ensures that different clients provide a similar amount of updates. Then, with the goal of minimizing the overall training time, we propose a novel online learning formulation and algorithm for automatically determining the near-optimal communication and computation trade-off that is controlled by the degree of gradient sparsity. The online learning algorithm uses an estimated sign of the derivative of the objective function, which gives a regret bound that is asymptotically equal to the case where exact derivative is available. Experiments with real datasets confirm the benefits of our proposed approaches, showing up to 40% improvement in model accuracy for a finite training time. |
|---|---|
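The record carries no code, but the core primitive the abstract describes, communicating only a small subset of important gradient elements, is commonly realized as magnitude-based top-k sparsification. Below is a minimal NumPy sketch under that assumption; the function names and the fixed value of k are illustrative, not taken from the paper, whose contribution is precisely to adapt the degree of sparsity online rather than fix it.

```python
import numpy as np

def sparsify_topk(grad, k):
    # Client side: keep indices of the k largest-magnitude entries;
    # only these (index, value) pairs are communicated.
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    return idx, grad[idx]

def desparsify(idx, vals, size):
    # Server side: scatter the received values back into a dense vector,
    # treating all uncommunicated entries as zero.
    dense = np.zeros(size)
    dense[idx] = vals
    return dense

rng = np.random.default_rng(0)
grad = rng.normal(size=1000)

k = 50  # degree of sparsity; communicate 5% of entries
idx, vals = sparsify_topk(grad, k)
recovered = desparsify(idx, vals, grad.size)
```

Under this scheme a smaller k lowers communication cost per round but discards more gradient information, which is the communication/computation trade-off the paper's online learning algorithm tunes.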
| Author | Han, Pengchao; Leung, Kin K.; Wang, Shiqiang |
| Author details | 1. Pengchao Han (hanpengchao199@gmail.com), Imperial College London, Department of Electrical and Electronic Engineering, UK; 2. Shiqiang Wang (wangshiq@us.ibm.com), IBM T. J. Watson Research Center, Yorktown Heights, NY, USA; 3. Kin K. Leung (kin.leung@imperial.ac.uk), Imperial College London, Department of Electrical and Electronic Engineering, UK |
| CODEN | IEEPAD |
| DOI | 10.1109/ICDCS47774.2020.00026 |
| Discipline | Computer Science |
| EISBN | 1728170028; 9781728170022 |
| EISSN | 2575-8411 |
| ExternalDocumentID | 9355797 |
| Genre | orig-research |
| GrantInformation | Ministry of Defence (funder ID 10.13039/100009941) |
| ISICitedReferencesCount | 127 |
| PageCount | 11 |
| PublicationTitleAbbrev | ICDCS |
| SubjectTerms | Collaborative work; Distributed machine learning; edge computing; federated learning; Finite element analysis; gradient sparsification; Minimization; online learning; Optimization; Privacy; Synchronization; Training |
| URI | https://ieeexplore.ieee.org/document/9355797 |