Coding for Large-Scale Distributed Machine Learning

Published in: Entropy (Basel, Switzerland), Volume 24, Issue 9, p. 1284
Main authors: Xiao, Ming; Skoglund, Mikael
Medium: Journal Article
Language: English
Published: Basel: MDPI AG, 1 September 2022
ISSN: 1099-4300
Abstract This article aims to give a comprehensive and rigorous review of the principles and recent development of coding for large-scale distributed machine learning (DML). With increasing data volumes and the pervasive deployment of sensors and computing machines, machine learning has become more distributed, and the computing nodes and data volumes involved in learning tasks have increased significantly. In large-scale distributed learning systems, significant challenges have appeared in terms of delay, errors, efficiency, etc. To address these problems, various error-control or performance-boosting schemes, such as the duplication of computing nodes, have recently been proposed. More recently, error-control coding has been investigated for DML to improve reliability and efficiency; its benefits include high efficiency and low complexity. Despite the benefits and recent progress, however, there is still a lack of a comprehensive survey on this topic, especially for large-scale learning. This paper seeks to introduce the theories and algorithms of coding for DML. For primal-based DML schemes, we first discuss gradient coding with the optimal code distance and then introduce random coding for gradient-based DML. For primal–dual-based DML, i.e., ADMM (alternating direction method of multipliers), we propose a separate coding method for the two steps of distributed optimization and discuss coding schemes for the different steps. Finally, a few potential directions for future work are given.
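The gradient coding mentioned in the abstract can be illustrated with a minimal sketch (not the paper's own construction; the coefficients below follow the classic small example for 3 workers tolerating 1 straggler). Each worker transmits a fixed linear combination of two partial gradients, and the master recovers the full gradient from any 2 of the 3 workers:

```python
import numpy as np

# Minimal gradient-coding sketch: 3 workers, data split into 3 partitions,
# each worker computes a fixed linear combination of 2 partial gradients.
# The master recovers the full gradient g1 + g2 + g3 from ANY 2 workers,
# i.e., the scheme tolerates 1 straggler.

rng = np.random.default_rng(0)
g = rng.standard_normal((3, 5))        # g[i] = partial gradient of partition i
full_gradient = g.sum(axis=0)          # what the master ultimately needs

# Encoding matrix B (rows = workers, columns = partitions): worker i sends B[i] @ g.
B = np.array([[0.5, 1.0,  0.0],
              [0.0, 1.0, -1.0],
              [0.5, 0.0,  1.0]])
coded = B @ g                          # each row is one worker's transmission

def decode(coded, workers, B):
    """Recover the all-ones combination (the full gradient) from a worker subset."""
    # Solve a^T B[workers] = [1, 1, 1] for the decoding coefficients a.
    a, *_ = np.linalg.lstsq(B[workers].T, np.ones(3), rcond=None)
    return a @ coded[workers]

# Any 2 of the 3 workers suffice:
for straggler in range(3):
    alive = [w for w in range(3) if w != straggler]
    recovered = decode(coded, alive, B)
    assert np.allclose(recovered, full_gradient)
```

The design criterion is that the all-ones row vector lies in the row span of every admissible submatrix of B; the "optimal code distance" discussed in the paper concerns how many stragglers such a code can tolerate for a given redundancy.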
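For the primal–dual (ADMM) setting in the abstract, the two alternating stages that a separate coding method would protect can be seen in a plain (uncoded) global-consensus ADMM sketch, following the standard scaled-dual formulation rather than the paper's coded variant. N nodes jointly minimize a sum of local quadratics, alternating a parallel local primal step with a global aggregation step:

```python
import numpy as np

# Minimal global-consensus ADMM sketch (standard scaled form, not the paper's
# coded variant): N nodes each hold a local vector a_i and jointly minimize
#   sum_i 0.5 * ||x - a_i||^2,
# whose solution is the mean of the a_i. ADMM alternates a local primal step
# (per node, parallelizable) and a global aggregation step -- the two stages
# that a separate coding scheme would protect individually.

rng = np.random.default_rng(1)
N, d, rho = 5, 3, 1.0
a = rng.standard_normal((N, d))        # local data, one row per node

x = np.zeros((N, d))                   # local primal variables x_i
z = np.zeros(d)                        # global consensus variable
u = np.zeros((N, d))                   # scaled dual variables u_i

for _ in range(100):
    # Local step: x_i = argmin 0.5||x - a_i||^2 + (rho/2)||x - z + u_i||^2
    x = (a + rho * (z - u)) / (1.0 + rho)
    # Global step: average the dual-corrected local variables
    z = (x + u).mean(axis=0)
    # Dual update
    u += x - z

# z converges geometrically to the consensus solution, the mean of the a_i.
assert np.allclose(z, a.mean(axis=0), atol=1e-6)
```

In a coded version, the local computations and the aggregation are encoded separately, so that a straggling or erroneous node at either stage does not stall the iteration.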
Authors: Xiao, Ming (ORCID 0000-0002-5407-0835); Skoglund, Mikael (ORCID 0000-0002-7926-5081)
Author affiliation: Division of Information Science and Engineering, Royal Institute of Technology, Malvinas Vag 10, KTH, 100-44 Stockholm, Sweden
Audience: Academic
Copyright: © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
DOI 10.3390/e24091284
Funding: Swedish Research Council (VR), grant 2021-04772
Subjects: ADMM; Algorithms; Analysis; Blacklisting; Coding; Coding theory; Cognitive tasks; Communication; Communications networks; Computation; Distributed processing (Computers); Efficiency; Error analysis; error-control coding; gradient coding; Internet of Things; Large-scale systems; Machine learning; Methods; Neural networks; Nodes; Optimization; Privacy; random codes; Review