An Analysis Tool for Push-Sum Based Distributed Optimization

Detailed Bibliography
Published in: IEEE Transactions on Automatic Control, pp. 1-8
Main Authors: Lin, Yixuan; Zhu, Zeru; Liu, Ji
Format: Journal Article
Language: English
Publication Details: IEEE, 2025
ISSN: 0018-9286, 1558-2523
Online Access: Get full text
Description
Summary: This paper establishes the explicit absolute probability sequence for the push-sum algorithm and, based on it, constructs quadratic Lyapunov functions for push-sum based distributed optimization algorithms. As illustrative examples, the proposed analysis tool establishes optimal convergence rates for subgradient-push and stochastic gradient-push, two important algorithms for distributed convex optimization over directed graphs. Specifically, the paper proves that the subgradient-push algorithm with a constant stepsize over a finite horizon of $T$ steps converges at a rate of $O(1/\sqrt{T})$ for general convex functions, and that the stochastic gradient-push algorithm with a time-dependent diminishing stepsize (indexed by $t$) converges at a rate of $O(1/t)$ for strongly convex functions over time-varying directed graphs. Both rates match the state-of-the-art rates of their single-agent counterparts and are therefore optimal.
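For readers unfamiliar with the consensus primitive underlying subgradient-push, the following is a minimal sketch (not the paper's code) of plain push-sum averaging over a fixed, strongly connected directed graph. All names and the graph used here are illustrative assumptions; the paper itself treats time-varying graphs and gradient updates on top of this primitive.

```python
import numpy as np

def push_sum(x0, out_neighbors, steps=200):
    """Plain push-sum averaging (illustrative sketch).

    Each node i keeps a value x_i and a weight w_i (initially 1).
    At every step it splits (x_i, w_i) equally among its out-neighbors
    (self-loops included), and the ratio x_i / w_i converges to the
    average of the initial values, even though the mixing matrix is
    only column-stochastic.
    """
    n = len(x0)
    x = np.array(x0, dtype=float)
    w = np.ones(n)
    for _ in range(steps):
        new_x, new_w = np.zeros(n), np.zeros(n)
        for i in range(n):
            targets = out_neighbors[i]
            for j in targets:
                # node i pushes an equal share of its mass to each out-neighbor
                new_x[j] += x[i] / len(targets)
                new_w[j] += w[i] / len(targets)
        x, w = new_x, new_w
    return x / w  # each entry approaches mean(x0)

# Hypothetical example: 3-node directed ring with self-loops
out = {0: [0, 1], 1: [1, 2], 2: [2, 0]}
vals = push_sum([3.0, 6.0, 9.0], out)  # each entry is close to 6.0
```

Subgradient-push interleaves a (sub)gradient step into this update; the paper's contribution is a quadratic Lyapunov function, built from the algorithm's absolute probability sequence, that yields the optimal rates stated above.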
DOI: 10.1109/TAC.2025.3596669