Convergence Analysis of Distributed Gradient Descent Algorithms With One and Two Momentum Terms

Bibliographic Details
Published in: IEEE Transactions on Cybernetics, Vol. 54, No. 3, pp. 1511-1522
Main Authors: Liu, Bing; Chai, Li; Yi, Jingwen
Format: Journal Article
Language: English
Published: United States: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.03.2024
Subjects:
ISSN: 2168-2267; EISSN: 2168-2275
Online Access: Full text
Abstract For centralized optimization, it is well known that adding one momentum term (also called the heavy-ball method) yields a faster convergence rate than the plain gradient method. For the distributed counterpart, however, there are few results on how added momentum terms affect the convergence rate. This article studies this issue in the distributed setup, where N agents minimize the sum of their individual cost functions using local communication over a network. The cost functions are twice continuously differentiable. We first study the algorithm with one momentum term and develop a distributed heavy-ball (D-HB) method by adding a momentum term to the distributed gradient algorithm. Borrowing tools from control theory, we provide a simple convergence proof and an explicit expression for the optimal convergence rate. Furthermore, we consider the case of two momentum terms and propose a distributed double-heavy-ball (D-DHB) method. We show that adding one momentum term allows faster convergence, whereas adding two momentum terms offers no additional benefit. Finally, simulation examples are given to illustrate our findings.
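The abstract describes the D-HB idea only in words. Purely as an illustration of the kind of iteration such a method uses, the following is a minimal sketch of a distributed gradient step combined with one heavy-ball momentum term; the function name, the mixing matrix W, the step size alpha, and the momentum coefficient beta are illustrative assumptions, not the paper's notation or its actual algorithm.

import numpy as np

def distributed_heavy_ball(grads, W, x0, alpha=0.05, beta=0.5, iters=300):
    # Illustrative sketch (not the paper's algorithm): each agent i mixes its
    # neighbors' states through a doubly stochastic matrix W, takes a local
    # gradient step, and adds one heavy-ball momentum term:
    #   x_i(k+1) = sum_j W[i, j] * x_j(k) - alpha * grad f_i(x_i(k))
    #              + beta * (x_i(k) - x_i(k-1))
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        g = np.stack([grads[i](x[i]) for i in range(len(grads))])
        x_next = W @ x - alpha * g + beta * (x - x_prev)
        x_prev, x = x, x_next
    return x

# Toy usage: 3 agents with quadratic costs f_i(x) = 0.5 * (x - c_i)^2,
# so the global minimizer of the sum is mean(c) = 2.
if __name__ == "__main__":
    c = [1.0, 2.0, 3.0]
    grads = [lambda x, ci=ci: x - ci for ci in c]   # local gradients
    W = np.array([[0.50, 0.25, 0.25],
                  [0.25, 0.50, 0.25],
                  [0.25, 0.25, 0.50]])              # doubly stochastic mixing matrix
    x0 = np.zeros((3, 1))
    # With a constant step size, every agent ends up close to (not exactly at) 2.
    print(distributed_heavy_ball(grads, W, x0))

A double-heavy-ball variant (the D-DHB studied in the article) would add a second momentum term (e.g., one depending on x(k-1) - x(k-2)); per the abstract, this brings no additional benefit over the single-term version.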
Author Liu, Bing
Yi, Jingwen
Chai, Li
Author_xml – sequence: 1
  givenname: Bing
  orcidid: 0000-0002-6667-9250
  surname: Liu
  fullname: Liu, Bing
  email: liubing17@wust.edu.cn
  organization: Engineering Research Center of Metallurgical Automation and Measurement Technology, Wuhan University of Science and Technology, Wuhan, China
– sequence: 2
  givenname: Li
  orcidid: 0000-0002-4331-0565
  surname: Chai
  fullname: Chai, Li
  email: chaili@zju.edu.cn
  organization: College of Control Science and Engineering, Zhejiang University, Hangzhou, China
– sequence: 3
  givenname: Jingwen
  orcidid: 0000-0003-0835-5335
  surname: Yi
  fullname: Yi, Jingwen
  email: yijingwen@wust.edu.cn
  organization: Engineering Research Center of Metallurgical Automation and Measurement Technology, Wuhan University of Science and Technology, Wuhan, China
BackLink https://www.ncbi.nlm.nih.gov/pubmed/36355726 (View this record in MEDLINE/PubMed)
CODEN ITCEB8
CitedBy_id crossref_primary_10_1109_TCSII_2024_3435066
crossref_primary_10_1109_TAC_2025_3546087
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2024
Copyright_xml – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2024
DOI 10.1109/TCYB.2022.3218663
Discipline Sciences (General)
EISSN 2168-2275
EndPage 1522
ExternalDocumentID 36355726
10_1109_TCYB_2022_3218663
9945633
Genre orig-research
Journal Article
GrantInformation_xml – fundername: National Natural Science Foundation of China
  grantid: 61903284; 62173259
  funderid: 10.13039/501100001809
ISICitedReferencesCount 4
ISSN 2168-2267
2168-2275
IsPeerReviewed true
IsScholarly true
Issue 3
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
ORCID 0000-0002-6667-9250
0000-0002-4331-0565
0000-0003-0835-5335
PMID 36355726
PQID 2924035380
PQPubID 85422
PageCount 12
PublicationCentury 2000
PublicationDate 2024-03-01
PublicationDateYYYYMMDD 2024-03-01
PublicationDate_xml – month: 03
  year: 2024
  text: 2024-03-01
  day: 01
PublicationDecade 2020
PublicationPlace United States
PublicationPlace_xml – name: United States
– name: Piscataway
PublicationTitle IEEE transactions on cybernetics
PublicationTitleAbbrev TCYB
PublicationTitleAlternate IEEE Trans Cybern
PublicationYear 2024
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Publisher_xml – name: IEEE
– name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
StartPage 1511
SubjectTerms Algorithms
Control theory
Convergence
Convergence rate
Cost function
distributed optimization
Linear programming
Momentum
momentum term
Network topology
Newton method
Privacy
Routh criterion
Signal processing algorithms
Title Convergence Analysis of Distributed Gradient Descent Algorithms With One and Two Momentum Terms
URI https://ieeexplore.ieee.org/document/9945633
https://www.ncbi.nlm.nih.gov/pubmed/36355726
https://www.proquest.com/docview/2924035380
https://www.proquest.com/docview/2735171918
Volume 54
WOSCitedRecordID wos000881966000001