Convergence of Distributed Stochastic Variance Reduced Methods without Sampling Extra Data

Detailed Bibliography
Published in: arXiv.org
Main authors: Cen, Shicong; Zhang, Huishuai; Chi, Yuejie; Chen, Wei; Liu, Tie-Yan
Format: Paper (Working Paper/Pre-Print)
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 2020-07-09
Subjects: Algorithms, Convergence, Empirical analysis, Regularization, Sampling methods, Smoothness
ISSN: 2331-8422
DOI: 10.48550/arxiv.1905.12648
Online access: Get full text (https://www.proquest.com/docview/2232980699)

Abstract: Stochastic variance reduced methods have gained a lot of interest recently for empirical risk minimization due to their appealing runtime complexity. When the data size is large and disjointly stored on different machines, it becomes imperative to distribute the implementation of such variance reduced methods. In this paper, we consider a general framework that directly distributes popular stochastic variance reduced methods in the master/slave model, by assigning outer loops to the parameter server, and inner loops to worker machines. This framework is natural and friendly to implement, but its theoretical convergence is not well understood. We obtain a comprehensive understanding of algorithmic convergence with respect to data homogeneity by measuring the smoothness of the discrepancy between the local and global loss functions. We establish the linear convergence of distributed versions of a family of stochastic variance reduced algorithms, including those using accelerated and recursive gradient updates, for minimizing strongly convex losses. Our theory captures how the convergence of distributed algorithms behaves as the number of machines and the size of local data vary. Furthermore, we show that when the data are less balanced, regularization can be used to ensure convergence at a slower rate. We also demonstrate that our analysis can be further extended to handle nonconvex loss functions.
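
The abstract describes a master/slave scheme in which the parameter server runs the outer loop of a stochastic variance reduced method and each worker runs the inner loop on its own disjoint data shard. The sketch below is a minimal single-process simulation of that idea for plain SVRG on a strongly convex ridge-regression loss; it is not the paper's implementation, and all function names, step sizes, and problem sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic strongly convex problem: ridge regression with data split
# disjointly across simulated workers (sizes are illustrative).
n_workers, n_local, dim, lam = 4, 50, 10, 0.1
shards = [(rng.standard_normal((n_local, dim)), rng.standard_normal(n_local))
          for _ in range(n_workers)]

def local_full_grad(w, A, b):
    # Full gradient of the local loss (1/2m)||Aw - b||^2 + (lam/2)||w||^2.
    return A.T @ (A @ w - b) / len(b) + lam * w

def sample_grad(w, A, b, i):
    # Stochastic gradient at a single local sample i.
    return A[i] * (A[i] @ w - b[i]) + lam * w

def worker_inner_loop(w_snap, mu, A, b, steps=100, eta=0.01):
    # SVRG inner loop run entirely on one worker's shard: variance-reduced
    # updates anchored at the server's snapshot w_snap and its gradient mu.
    w = w_snap.copy()
    for _ in range(steps):
        i = rng.integers(len(b))
        g = sample_grad(w, A, b, i) - sample_grad(w_snap, A, b, i) + mu
        w -= eta * g
    return w

w = np.zeros(dim)
for outer in range(20):
    # Outer loop on the parameter server: aggregate local full gradients
    # into the global snapshot gradient, broadcast it, then average the
    # workers' inner-loop iterates to form the next snapshot.
    mu = np.mean([local_full_grad(w, A, b) for A, b in shards], axis=0)
    w = np.mean([worker_inner_loop(w, mu, A, b) for A, b in shards], axis=0)
    print(f"outer {outer:2d}  ||grad|| = {np.linalg.norm(mu):.3e}")
```

Averaging the workers' iterates is one simple aggregation rule for this framework; how fast such a scheme converges as the number of machines, local data size, and data balance vary is exactly the question the paper analyzes.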

Copyright: 2020. This work is published under http://arxiv.org/licenses/nonexclusive-distrib/1.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.