Faster Convergence With Less Communication: Broadcast-Based Subgraph Sampling for Decentralized Learning Over Wireless Networks

Published in: IEEE Open Journal of the Communications Society, Volume 6, pp. 1497–1511
Main authors: Perez Herrera, Daniel; Chen, Zheng; Larsson, Erik G.
Medium: Journal Article
Language: English
Published: New York: IEEE, 2025
Publisher: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
ISSN: 2644-125X
Online access: Get full text
Abstract Decentralized stochastic gradient descent (D-SGD) is a widely adopted optimization algorithm for decentralized training of machine learning models across networked agents. A crucial part of D-SGD is the consensus-based model averaging, which heavily relies on information exchange and fusion among the nodes. For consensus averaging over wireless networks, due to the broadcast nature of wireless channels, simultaneous transmissions from multiple nodes may cause packet collisions if they share a common receiver. Therefore, communication coordination is necessary to determine when and how a node can transmit (or receive) information to (or from) its neighbors. In this work, we propose BASS, a broadcast-based subgraph sampling method designed to accelerate the convergence of D-SGD while considering the actual communication cost per iteration. BASS creates a set of mixing matrix candidates that represent sparser subgraphs of the base topology. In each consensus iteration, one mixing matrix is randomly sampled, leading to a specific scheduling decision that activates multiple collision-free subsets of nodes. The sampling occurs in a probabilistic manner, and the elements of the mixing matrices, along with their sampling probabilities, are jointly optimized. Simulation results demonstrate that BASS achieves faster convergence and requires fewer transmission slots than existing link-based scheduling methods and the full communication scenario. In conclusion, the inherent broadcasting nature of wireless channels offers intrinsic advantages in accelerating the convergence of decentralized optimization and learning.
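The randomized consensus step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' optimized BASS procedure: the candidate subgraphs, edge weights, and sampling probabilities below are hypothetical, and the local gradient step of D-SGD is omitted so that only the random mixing-matrix sampling is shown.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5  # number of agents in the network

def edge_matrix(n, edges, weight=0.5):
    """Symmetric, doubly stochastic mixing matrix that averages
    across a set of disjoint edges (a collision-free subgraph)."""
    W = np.eye(n)
    for i, j in edges:
        W[i, i] = W[j, j] = 1.0 - weight
        W[i, j] = W[j, i] = weight
    return W

# Hypothetical candidate subgraphs of a 5-node ring topology; in BASS the
# matrix entries and sampling probabilities are jointly optimized instead.
candidates = [
    edge_matrix(n, [(0, 1), (2, 3)]),
    edge_matrix(n, [(1, 2), (3, 4)]),
    edge_matrix(n, [(0, 4)]),
]
probs = np.array([0.4, 0.4, 0.2])  # illustrative sampling probabilities

x = rng.normal(size=(n, 1))        # one scalar "model" per agent
for _ in range(1000):
    k = rng.choice(len(candidates), p=probs)
    x = candidates[k] @ x          # consensus averaging (gradient step omitted)

# Each matrix preserves the network-wide average, and because the union of
# the candidate subgraphs is connected, all agents converge to that average.
print(float(np.abs(x - x.mean()).max()))
```

Since every candidate matrix is doubly stochastic, the average of the agents' values is invariant at every iteration; connectivity of the union graph is what drives all agents to agree on it.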
Author Perez Herrera, Daniel (ORCID: 0000-0002-6097-7935), Department of Electrical Engineering, Linköping University, Linköping, Sweden
Chen, Zheng (ORCID: 0000-0001-5621-2860; email: zheng.chen@liu.se), Department of Electrical Engineering, Linköping University, Linköping, Sweden
Larsson, Erik G. (ORCID: 0000-0002-7599-4367), Department of Electrical Engineering, Linköping University, Linköping, Sweden
CODEN IOJCAZ
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2025
DOI 10.1109/OJCOMS.2025.3540133
Discipline Engineering
EISSN 2644-125X
EndPage 1511
Genre orig-research
GrantInformation_xml – fundername: Swedish Research Council (VR)
– fundername: Zenith, ELLIIT, Knut and Alice Wallenberg Foundation
ISSN 2644-125X
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Keywords broadcast
Costs
Symmetric matrices
Laplace equations
Stochastic processes
wireless networks
Vectors
Topology
node scheduling
Optimization
Convergence
Training
Network topology
D-SGD
Decentralized machine learning
Language English
License https://creativecommons.org/licenses/by-nc-nd/4.0
OpenAccessLink https://ieeexplore.ieee.org/document/10879080
PageCount 15
PublicationDate 2025
PublicationPlace New York
PublicationTitle IEEE open journal of the Communications Society
PublicationTitleAbbrev OJCOMS
PublicationYear 2025
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
StartPage 1497
SubjectTerms Algorithms
broadcast
Broadcasting
Channels
Collision avoidance
Communication
Convergence
Costs
D-SGD
Decentralized machine learning
Graph theory
Laplace equations
Machine learning
Network topology
node scheduling
Nodes
Optimization
Sampling methods
Scheduling
Stochastic processes
Symmetric matrices
Topology
Training
Vectors
Wireless networks
Volume 6