Compressed gradient tracking algorithms for distributed nonconvex optimization

Bibliographic Details
Published in: Automatica (Oxford), Volume 177, p. 112286
Main Authors: Xu, Lei; Yi, Xinlei; Wen, Guanghui; Shi, Yang; Johansson, Karl H.; Yang, Tao
Format: Journal Article
Language: English
Published: Elsevier Ltd, 01.07.2025
ISSN: 0005-1098, 1873-2836
Description
Summary: In this paper, we study the distributed nonconvex optimization problem, aiming to minimize the average value of the local nonconvex cost functions using local information exchange. To reduce the communication overhead, we introduce three general classes of compressors, i.e., compressors with bounded relative compression error, compressors with globally bounded absolute compression error, and compressors with locally bounded absolute compression error. By integrating them, respectively, with the distributed gradient tracking algorithm, we then propose three corresponding compressed distributed nonconvex optimization algorithms. Motivated by the state-of-the-art BEER algorithm proposed in Zhao et al. (2022), which is an efficient compressed algorithm integrating gradient tracking with biased and contractive compressors, our first proposed algorithm extends it to accommodate both biased and non-contractive compressors. For each algorithm, we design a novel Lyapunov function to demonstrate its sublinear convergence to a stationary point if the local cost functions are smooth. Furthermore, when the global cost function satisfies the Polyak–Łojasiewicz (P–Ł) condition, we show that our proposed algorithms linearly converge to a global optimal point. It is worth noting that, for compressors with bounded relative compression error and globally bounded absolute compression error, our proposed algorithms’ parameters do not require prior knowledge of the P–Ł constant.
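
As a reading aid, the sketch below restates the optimization problem from the summary and gives one common formalization of the three compressor classes and of the P–Ł condition; the symbols r, σ, and μ are illustrative placeholders and need not match the paper's exact definitions or notation.

% Problem studied (as stated in the summary): n agents cooperatively minimize the
% average of their local, possibly nonconvex, cost functions f_i using only local
% information exchange.
\begin{equation*}
  \min_{x \in \mathbb{R}^d} \; f(x) \;=\; \frac{1}{n} \sum_{i=1}^{n} f_i(x).
\end{equation*}

% Illustrative formalizations of the three compressor classes named in the summary
% (the paper's exact definitions and constants may differ), with C(x) the compressed
% version of a vector x:
\begin{align*}
  \text{bounded relative compression error:}\quad
    & \mathbb{E}\,\|\mathcal{C}(x) - x\|^2 \le r\,\|x\|^2
      \quad \text{for all } x \in \mathbb{R}^d,\\
  \text{globally bounded absolute compression error:}\quad
    & \mathbb{E}\,\|\mathcal{C}(x) - x\|^2 \le \sigma^2
      \quad \text{for all } x \in \mathbb{R}^d,\\
  \text{locally bounded absolute compression error:}\quad
    & \mathbb{E}\,\|\mathcal{C}(x) - x\|^2 \le \sigma^2
      \quad \text{for all } x \text{ in a given bounded set}.
\end{align*}

% Polyak–Łojasiewicz (P–Ł) condition on the global cost f with minimum value f^*:
\begin{equation*}
  \tfrac{1}{2}\,\|\nabla f(x)\|^2 \;\ge\; \mu\,\bigl(f(x) - f^{\ast}\bigr)
  \quad \text{for all } x \in \mathbb{R}^d.
\end{equation*}
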
DOI: 10.1016/j.automatica.2025.112286