Compressed gradient tracking algorithms for distributed nonconvex optimization

Bibliographic Details
Published in: Automatica (Oxford), Vol. 177, p. 112286
Main Authors: Xu, Lei; Yi, Xinlei; Wen, Guanghui; Shi, Yang; Johansson, Karl H.; Yang, Tao
Format: Journal Article
Language: English
Published: Elsevier Ltd, 01.07.2025
ISSN: 0005-1098, 1873-2836
Description
Summary: In this paper, we study the distributed nonconvex optimization problem of minimizing the average of local nonconvex cost functions using only local information exchange. To reduce the communication overhead, we introduce three general classes of compressors: compressors with bounded relative compression error, compressors with globally bounded absolute compression error, and compressors with locally bounded absolute compression error. By integrating each class with the distributed gradient tracking algorithm, we propose three corresponding compressed distributed nonconvex optimization algorithms. Motivated by the state-of-the-art BEER algorithm of Zhao et al. (2022), an efficient compressed algorithm that integrates gradient tracking with biased and contractive compressors, our first algorithm extends BEER to accommodate both biased and non-contractive compressors. For each algorithm, we design a novel Lyapunov function to establish sublinear convergence to a stationary point when the local cost functions are smooth. Furthermore, when the global cost function satisfies the Polyak–Łojasiewicz (P–Ł) condition, we show that the proposed algorithms converge linearly to a global optimum. Notably, for compressors with bounded relative compression error and those with globally bounded absolute compression error, the algorithm parameters do not require prior knowledge of the P–Ł constant.
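
To make the ingredients named in the abstract concrete, below is a minimal Python sketch assuming standard textbook forms of these objects: a top-k operator as an example of a compressor with bounded relative compression error, a fixed-step quantizer as an example of a compressor with globally bounded absolute compression error, and the uncompressed gradient tracking update that the proposed algorithms build on. The function names, the NumPy interface, and the omission of the compression/error-compensation machinery used in BEER-type schemes are illustrative assumptions, not the paper's implementation.

```python
# Minimal illustrative sketch; NOT the paper's algorithms or parameter choices.
import numpy as np


def top_k(v, k):
    """Keep the k largest-magnitude entries of v, zero the rest.

    Example of a compressor with bounded relative compression error:
    ||top_k(v) - v||^2 <= (1 - k/len(v)) * ||v||^2 for every v.
    """
    out = np.zeros_like(v)
    keep = np.argsort(np.abs(v))[-k:]  # indices of the k largest-magnitude entries
    out[keep] = v[keep]
    return out


def uniform_quantize(v, step=0.1):
    """Round each entry of v to the nearest multiple of `step`.

    Example of a compressor with globally bounded absolute compression error:
    every entry is perturbed by at most step/2, independently of ||v||.
    """
    return step * np.round(v / step)


def gradient_tracking_step(x, y, grads, grad_fn, W, eta):
    """One standard (uncompressed) distributed gradient tracking iteration.

    x, y, grads are (n, d) arrays stacking the agents' iterates, gradient
    trackers, and most recent local gradients; W is an (n, n) doubly
    stochastic mixing matrix; grad_fn(i, xi) returns the gradient of f_i at xi.

        x_i^{t+1} = sum_j W_ij x_j^t - eta * y_i^t
        y_i^{t+1} = sum_j W_ij y_j^t + grad f_i(x_i^{t+1}) - grad f_i(x_i^t)

    Compressed variants additionally compress what neighbors exchange
    (e.g., differences with error compensation); that machinery is omitted here.
    """
    x_next = W @ x - eta * y
    grads_next = np.stack([grad_fn(i, x_next[i]) for i in range(x.shape[0])])
    y_next = W @ y + grads_next - grads
    return x_next, y_next, grads_next
```

In this reading, top-k with k < d is a contractive instance of the first compressor class; the abstract states that the first proposed algorithm also handles non-contractive compressors, i.e., those whose relative compression error bound is not strictly below one.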
DOI: 10.1016/j.automatica.2025.112286