Coding for Large-Scale Distributed Machine Learning
Saved in:
| Published in: | Entropy (Basel, Switzerland) Volume 24; Issue 9; p. 1284 |
|---|---|
| Main authors: | Xiao, Ming; Skoglund, Mikael |
| Format: | Journal Article |
| Language: | English |
| Publication details: | Basel: MDPI AG, 01.09.2022 |
| Subject: | |
| ISSN: | 1099-4300 |
| Online access: | Get full text |
| Abstract | This article aims to give a comprehensive and rigorous review of the principles and recent development of coding for large-scale distributed machine learning (DML). With increasing data volumes and the pervasive deployment of sensors and computing machines, machine learning has become more distributed. Moreover, the involved computing nodes and data volumes for learning tasks have also increased significantly. For large-scale distributed learning systems, significant challenges have appeared in terms of delay, errors, efficiency, etc. To address these problems, various error-control or performance-boosting schemes have been proposed recently for different aspects, such as the duplication of computing nodes. More recently, error-control coding has been investigated for DML to improve reliability and efficiency. The benefits of coding for DML include high efficiency, low complexity, etc. Despite the benefits and recent progress, however, there is still a lack of a comprehensive survey on this topic, especially for large-scale learning. This paper seeks to introduce the theories and algorithms of coding for DML. For primal-based DML schemes, we first discuss gradient coding with the optimal code distance. Then, we introduce random coding for gradient-based DML. For primal–dual-based DML, i.e., ADMM (alternating direction method of multipliers), we propose a separate coding method for the two steps of distributed optimization. Then coding schemes for the different steps are discussed. Finally, a few potential directions for future works are also given. |
|---|---|
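To make the surveyed gradient-coding idea concrete, below is a minimal NumPy sketch of straggler-tolerant gradient coding for primal (gradient-descent-based) DML, using the classic 3-worker, 1-straggler encoding matrix; the toy least-squares data, the partition sizes, and the helper `partial_gradient` are illustrative assumptions rather than the paper's exact construction.

```python
import numpy as np

# Toy setting: least-squares loss f(w) = 0.5 * ||X w - y||^2, with the data
# split into k = 3 partitions so that g_j is the gradient over partition j.
rng = np.random.default_rng(0)
X = rng.normal(size=(12, 4))
y = rng.normal(size=12)
w = rng.normal(size=4)
parts = np.array_split(np.arange(12), 3)

def partial_gradient(j):
    """Gradient of the loss restricted to data partition j (computed by worker j)."""
    Xj, yj = X[parts[j]], y[parts[j]]
    return Xj.T @ (Xj @ w - yj)

# Encoding matrix for n = 3 workers tolerating s = 1 straggler:
# every 2 rows of B span the all-ones vector, so any 2 replies decode.
B = np.array([[0.5, 1.0,  0.0],
              [0.0, 1.0, -1.0],
              [0.5, 0.0,  1.0]])

g = np.stack([partial_gradient(j) for j in range(3)])  # (3, 4) partial gradients
coded = B @ g                                          # what each worker sends

# Suppose the worker with index 1 straggles; decode from the two survivors.
alive = [0, 2]
a = np.linalg.lstsq(B[alive].T, np.ones(3), rcond=None)[0]  # a @ B[alive] = [1, 1, 1]
recovered = a @ coded[alive]

assert np.allclose(recovered, g.sum(axis=0))  # full gradient despite the straggler
```

Any two of the three coded replies suffice here, which is the code-distance property the abstract refers to: the all-ones vector lies in the span of every 2-row submatrix of B.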
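For the primal–dual (ADMM-based) part of the survey, the sketch below shows the plain consensus-ADMM iteration whose two exchanged quantities (local primal estimates and the global consensus variable) are the natural places to add coded redundancy; the problem sizes, penalty parameter rho, and iteration count are illustrative assumptions.

```python
import numpy as np

# Consensus ADMM sketch: N workers jointly solve
#   minimize  sum_i 0.5 * ||A_i x - b_i||^2
# by alternating local primal updates with a global averaging (consensus) step.
rng = np.random.default_rng(1)
N, d, m = 4, 5, 20
A = [rng.normal(size=(m, d)) for _ in range(N)]
b = [rng.normal(size=m) for _ in range(N)]

rho = 1.0
x = [np.zeros(d) for _ in range(N)]   # local primal variables
u = [np.zeros(d) for _ in range(N)]   # scaled dual variables
z = np.zeros(d)                       # global consensus variable

for _ in range(100):
    # Local x-update (closed form for the quadratic local loss).
    for i in range(N):
        x[i] = np.linalg.solve(A[i].T @ A[i] + rho * np.eye(d),
                               A[i].T @ b[i] + rho * (z - u[i]))
    # Global z-update: average of local estimates plus duals.
    z = np.mean([x[i] + u[i] for i in range(N)], axis=0)
    # Dual update.
    for i in range(N):
        u[i] += x[i] - z

# Compare with the centralized least-squares solution.
A_all, b_all = np.vstack(A), np.concatenate(b)
x_star = np.linalg.lstsq(A_all, b_all, rcond=None)[0]
print(np.linalg.norm(z - x_star))   # should be small after enough iterations
```

A coded variant would add redundancy separately to the worker-to-master uploads in the x-step and to the broadcast of z, which appears to be what the abstract's "separate coding method for the two steps" refers to.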
| Audience | Academic |
| Author | Skoglund, Mikael; Xiao, Ming |
| AuthorAffiliation | Division of Information Science and Engineering, Royal Institute of Technology, Malvinas Väg 10, KTH, 100-44 Stockholm, Sweden |
| Author_xml | 1. Xiao, Ming (ORCID 0000-0002-5407-0835); 2. Skoglund, Mikael (ORCID 0000-0002-7926-5081) |
| BackLink | https://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-319712 (view record in the Swedish Publication Index, Kungliga Tekniska Högskolan) |
| CitedBy_id | crossref_primary_10_1109_JIOT_2024_3394714 crossref_primary_10_1109_TCCN_2024_3502495 crossref_primary_10_1186_s13634_025_01225_8 crossref_primary_10_3390_e24111604 crossref_primary_10_1109_TWC_2024_3366547 crossref_primary_10_3390_e25030392 |
| ContentType | Journal Article |
| Copyright | COPYRIGHT 2022 MDPI AG. 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. |
| DOI | 10.3390/e24091284 |
| EISSN | 1099-4300 |
| ExternalDocumentID | oai_doaj_org_article_84ec5ae4ef3244c1a01a5e2ae4cfa176 oai_DiVA_org_kth_319712 PMC9497980 A745709367 10_3390_e24091284 |
| GeographicLocations | Sweden |
| GrantInformation | Swedish Research Council (VR), grant 2021-04772 |
| ISICitedReferencesCount | 6 |
| ISICitedReferencesURI | http://www.webofscience.com/api/gateway?GWVersion=2&SrcApp=Summon&SrcAuth=ProQuest&DestLinkType=CitingArticles&DestApp=WOS_CPL&KeyUT=000858228600001&url=https%3A%2F%2Fcvtisr.summon.serialssolutions.com%2F%23%21%2Fsearch%3Fho%3Df%26include.ft.matches%3Dt%26l%3Dnull%26q%3D |
| ISSN | 1099-4300 |
| IsDoiOpenAccess | true |
| IsOpenAccess | true |
| IsPeerReviewed | true |
| IsScholarly | true |
| Issue | 9 |
| Language | English |
| License | Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
| ORCID | 0000-0002-7926-5081 0000-0002-5407-0835 |
| OpenAccessLink | https://doaj.org/article/84ec5ae4ef3244c1a01a5e2ae4cfa176 |
| PQID | 2716527494 |
| PQPubID | 2032401 |
| PublicationCentury | 2000 |
| PublicationDate | 2022-09-01 |
| PublicationDecade | 2020 |
| PublicationPlace | Basel |
| PublicationTitle | Entropy (Basel, Switzerland) |
| PublicationYear | 2022 |
| Publisher | MDPI AG MDPI |
| SecondaryResourceType | review_article |
| StartPage | 1284 |
| SubjectTerms | ADMM Algorithms Analysis Blacklisting Coding Coding theory Cognitive tasks Communication Communications networks Computation Distributed processing (Computers) Efficiency Error analysis error-control coding gradient coding Internet of Things Large-scale systems Machine learning Methods Neural networks Nodes Optimization Privacy random codes Review |
| Title | Coding for Large-Scale Distributed Machine Learning |
| URI | https://www.proquest.com/docview/2716527494 https://www.proquest.com/docview/2717688439 https://pubmed.ncbi.nlm.nih.gov/PMC9497980 https://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-319712 https://doaj.org/article/84ec5ae4ef3244c1a01a5e2ae4cfa176 |
| Volume | 24 |
| WOSCitedRecordID | wos000858228600001&url=https%3A%2F%2Fcvtisr.summon.serialssolutions.com%2F%23%21%2Fsearch%3Fho%3Df%26include.ft.matches%3Dt%26l%3Dnull%26q%3D |