A Computation-Efficient Decentralized Algorithm for Composite Constrained Optimization


Bibliographic Details
Published in: IEEE Transactions on Signal and Information Processing over Networks, Vol. 6, pp. 774-789
Main Authors: Lu, Qingguo; Liao, Xiaofeng; Li, Huaqing; Huang, Tingwen
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2020
ISSN: 2373-776X, 2373-7778
Online Access: Full text
Description
Abstract: This paper focuses on solving the problem of composite constrained convex optimization with a sum of smooth convex functions and non-smooth regularization terms (ℓ1 norm) subject to locally general constraints. Motivated by modern large-scale information processing problems in machine learning (the samples of a training dataset are randomly decentralized across multiple computing nodes), each of the smooth objective functions is further considered as the average of several constituent functions. To address the problem in a decentralized fashion, we propose a novel computation-efficient decentralized stochastic gradient algorithm, which leverages the variance reduction technique and the decentralized stochastic gradient projection method with a constant step-size. Theoretical analysis indicates that if the constant step-size is less than an explicitly estimated upper bound, the proposed algorithm finds the exact optimal solution in expectation when each (smooth) constituent function is strongly convex. Compared with existing decentralized schemes, the proposed algorithm is not only suitable for solving general constrained optimization problems but also has low computation cost in terms of the total number of local gradient evaluations. Furthermore, equipped with a differential privacy strategy, the proposed algorithm can effectively mask the privacy of each constituent function, which is more practical in applications involving sensitive information, such as military affairs or medical treatment. Finally, numerical evidence is provided to demonstrate the appealing performance of the proposed algorithm.
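The abstract describes a decentralized stochastic gradient projection method with variance reduction and a constant step-size. The following is a minimal Python sketch of that general idea, not the paper's actual algorithm: it assumes scalar quadratic constituent functions, a box interval as each node's local constraint set, SVRG-style variance reduction, and soft-thresholding for the ℓ1 term. All function names, the mixing scheme, and the parameter choices are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, lam):
    # Proximal operator of lam * |x| (one simple way to handle the
    # non-smooth l1 term; chosen for illustration).
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def project_box(x, lo, hi):
    # Projection onto the local constraint set, here a box [lo, hi].
    return np.clip(x, lo, hi)

def decentralized_vr_pg(data, W, step=0.05, lam=0.1, lo=0.0, hi=10.0,
                        epochs=200, seed=0):
    """Sketch of a variance-reduced decentralized projected gradient method.

    data[i] holds the samples a_{ij} at node i; each constituent function
    is assumed to be f_{ij}(x) = 0.5 * (x - a_{ij})^2 (strongly convex).
    W is a doubly stochastic mixing matrix over the network.
    """
    rng = np.random.default_rng(seed)
    data = [np.asarray(d, dtype=float) for d in data]
    n = len(data)
    x = np.zeros(n)  # one scalar decision variable per node
    for _ in range(epochs):
        # SVRG-style reference point and full local gradients.
        x_ref = x.copy()
        full_grad = np.array([np.mean(x_ref[i] - data[i]) for i in range(n)])
        for _ in range(len(data[0])):
            x_mix = W @ x  # consensus averaging with neighbors
            x_new = np.empty(n)
            for i in range(n):
                j = rng.integers(len(data[i]))
                # Variance-reduced stochastic gradient at node i:
                # grad f_ij(x) - grad f_ij(x_ref) + full local gradient.
                g = (x[i] - data[i][j]) - (x_ref[i] - data[i][j]) + full_grad[i]
                # Constant-step gradient step, l1 prox, then projection.
                x_new[i] = project_box(
                    soft_threshold(x_mix[i] - step * g, step * lam), lo, hi)
            x = x_new
    return x
```

With a small constant step-size the node iterates reach consensus near the minimizer of the average objective plus the ℓ1 term, up to an O(step) bias typical of constant-step decentralized gradient methods.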
DOI: 10.1109/TSIPN.2020.3037837