Convergence rate analysis of distributed optimization with projected subgradient algorithm


Bibliographic Details
Published in: Automatica (Oxford) Vol. 83, pp. 162-169
Main Authors: Liu, Shuai, Qiu, Zhirong, Xie, Lihua
Format: Journal Article
Language: English
Published: Elsevier Ltd 01.09.2017
ISSN: 0005-1098, 1873-2836
Description
Summary: In this paper, we revisit the consensus-based projected subgradient algorithm proposed for a common set constraint. We show that the commonly adopted non-summable and square-summable diminishing step sizes of subgradients can be relaxed to be only non-summable, if the constrained optimum set is bounded. More importantly, for a strongly convex aggregate cost with different types of step sizes, we provide a systematic analysis to derive the asymptotic upper bound of convergence rates in terms of the optimum residual, and select the best step sizes accordingly. Our result shows that a convergence rate of O(1/k) can be achieved with a step size of O(1/k).
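The algorithm the abstract describes can be illustrated with a minimal sketch (not the paper's implementation): each agent mixes its state with its neighbors' via a doubly stochastic weight matrix, takes a local subgradient step with a diminishing O(1/k) step size, and projects onto the common constraint set. The local costs f_i(x) = (x - c_i)^2, the interval constraint X = [-1, 1], and all numeric values below are made-up assumptions chosen so the strongly convex aggregate cost has its constrained optimum at the mean of the c_i.

```python
import numpy as np

def project(x, lo=-1.0, hi=1.0):
    """Euclidean projection onto the common interval constraint X = [lo, hi]."""
    return np.clip(x, lo, hi)

c = np.array([0.2, 0.4, 0.9])          # local cost parameters (hypothetical)
W = np.array([[0.50, 0.25, 0.25],      # doubly stochastic mixing weights
              [0.25, 0.50, 0.25],      # over a fully connected 3-agent graph
              [0.25, 0.25, 0.50]])
x = np.zeros(3)                        # initial agent states

for k in range(2000):
    alpha = 1.0 / (k + 1)              # non-summable, O(1/k) step size
    grad = 2.0 * (x - c)               # local subgradients of f_i(x) = (x - c_i)^2
    x = project(W @ x - alpha * grad)  # consensus mix, subgradient step, project

# All agents approach the constrained optimum x* = mean(c) = 0.5.
print(x)
```

With the aggregate cost strongly convex and the step size O(1/k), this is the regime in which the paper derives its O(1/k) bound on the optimum residual; the sketch only demonstrates the iteration structure, not the rate analysis.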
DOI: 10.1016/j.automatica.2017.06.011