An Improved Distributed Nesterov Gradient Tracking Algorithm for Smooth Convex Optimization Over Directed Networks


Bibliographic Details
Published in: IEEE Transactions on Automatic Control, Vol. 70, No. 4, pp. 2738-2745
Main Authors: Lin, Yifu; Li, Wenling; Zhang, Bin; Du, Junping
Format: Journal Article
Language: English
Published: New York: IEEE, 01.04.2025
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
ISSN: 0018-9286, 1558-2523
Online Access: Full Text
Description
Summary: This article explores the problem of distributed optimization for functions that are smooth and nonstrongly convex over directed networks. To address this issue, an improved distributed Nesterov gradient tracking (IDNGT) algorithm is proposed, which utilizes the adapt-then-combine rule and row-stochastic weights. A main novelty of the proposed algorithm is the introduction of a scale factor into the gradient tracking scheme to suppress the consensus error. Using the estimate sequence approach, the dynamics of the error caused by the imbalance of directed networks are analyzed, and it is shown that a sublinear convergence rate can be achieved with a vanishing step size. Numerical results suggest that the performance of IDNGT is comparable to that of the centralized Nesterov gradient descent algorithm.
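To give a rough sense of the gradient tracking scheme the abstract refers to, the sketch below implements plain adapt-then-combine gradient tracking on a toy quadratic problem. It is an illustrative assumption-based sketch, not the paper's method: it uses a doubly stochastic weight matrix on an undirected ring for simplicity, whereas IDNGT works with row-stochastic weights over directed graphs and additionally employs Nesterov momentum and a scale factor. All names, parameters, and the step size here are hypothetical choices.

```python
import numpy as np

# Illustrative sketch of adapt-then-combine gradient tracking.
# Simplification: doubly stochastic weights on an undirected ring;
# IDNGT itself handles row-stochastic weights over directed graphs
# with Nesterov momentum and a scale factor (not reproduced here).

n = 5                                      # number of agents
b = np.array([1.0, 2.0, 3.0, 4.0, 5.0])    # local data: f_i(x) = 0.5*(x - b_i)^2
x_star = b.mean()                          # global minimizer of sum_i f_i

# Lazy Metropolis-style doubly stochastic weights on a ring graph
A = np.zeros((n, n))
for i in range(n):
    A[i, i] = 0.5
    A[i, (i + 1) % n] = 0.25
    A[i, (i - 1) % n] = 0.25

grad = lambda x: x - b                     # local gradients, elementwise
alpha = 0.1                                # constant step size (illustrative)

x = np.zeros(n)                            # local iterates
y = grad(x)                                # trackers, initialized to local gradients

for _ in range(500):
    g_old = grad(x)
    x = A @ (x - alpha * y)                # adapt-then-combine: descend, then average
    y = A @ y + grad(x) - g_old            # track the network-average gradient

print(np.max(np.abs(x - x_star)))          # consensus error to the global optimum
```

Because the weight matrix here is doubly stochastic, the trackers `y` preserve the average gradient across iterations, which is what lets every agent converge to the global minimizer rather than its local one; handling merely row-stochastic weights, as the paper does, requires additional machinery.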
DOI: 10.1109/TAC.2024.3492329