Single-Loop Variance-Reduced Stochastic Algorithm for Nonconvex-Concave Minimax Optimization
| Published in: | Proceedings of the ... IEEE International Conference on Acoustics, Speech and Signal Processing (1998), pp. 1 - 5 |
|---|---|
| Main authors: | , , , |
| Format: | Conference paper |
| Language: | English |
| Published: | IEEE, 06.04.2025 |
| Subjects: | |
| ISSN: | 2379-190X |
| Online access: | Get full text |
| Summary: | Nonconvex-concave (NC-C) finite-sum minimax problems have broad applications in decentralized optimization and various machine learning tasks. However, the nonsmooth nature of NC-C problems makes it challenging to design effective variance reduction techniques. Existing vanilla stochastic algorithms using uniform samples for gradient estimation often exhibit slow convergence rates and require bounded-variance assumptions. In this paper, we develop a novel probabilistic variance-reduction updating scheme and propose a single-loop algorithm called the probabilistic variance-reduced smoothed gradient descent-ascent (PVR-SGDA) algorithm. The proposed algorithm achieves an iteration complexity of $\mathcal{O}(\varepsilon^{-4})$, surpassing the best-known rates of stochastic algorithms for NC-C minimax problems and matching the performance of the best deterministic algorithms in this context. Finally, we demonstrate the effectiveness of the proposed algorithm through numerical simulations. |
|---|---|
| ISSN: | 2379-190X |
| DOI: | 10.1109/ICASSP49660.2025.10889086 |
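
The summary describes a single-loop smoothed gradient descent-ascent method with a probabilistic variance-reduced gradient estimator for finite-sum NC-C problems. The record does not include the paper's actual update rules; the sketch below only illustrates one plausible form of such a scheme, assuming a PAGE-style probabilistic switch between full-batch refreshes and recursive minibatch corrections, plus an exponentially averaged smoothing anchor. The step sizes, smoothing weight `mu`, averaging weight `beta`, and switch probability `p` are placeholders, not the constants analyzed in the paper.

```python
import numpy as np

def pvr_sgda(grad_x_i, grad_y_i, n, x0, y0,
             eta_x=1e-2, eta_y=1e-1, mu=1.0, beta=0.9, p=0.1,
             batch_size=8, iters=1000, seed=0):
    """Illustrative single-loop smoothed GDA with a probabilistic
    variance-reduced gradient estimator for
        min_x max_y (1/n) * sum_i f_i(x, y).

    grad_x_i(i, x, y) and grad_y_i(i, x, y) return per-sample partial
    gradients. All parameter defaults are placeholders, not the paper's.
    """
    rng = np.random.default_rng(seed)
    x, y, z = x0.copy(), y0.copy(), x0.copy()  # z: smoothing anchor

    def full_grads(x, y):
        gx = np.mean([grad_x_i(i, x, y) for i in range(n)], axis=0)
        gy = np.mean([grad_y_i(i, x, y) for i in range(n)], axis=0)
        return gx, gy

    vx, vy = full_grads(x, y)  # initialize the estimator with full gradients
    for _ in range(iters):
        x_prev, y_prev = x.copy(), y.copy()
        # descent step on the smoothed primal: f(x, y) + (mu/2) * ||x - z||^2
        x = x - eta_x * (vx + mu * (x - z))
        # ascent step in the dual variable
        y = y + eta_y * vy
        # exponential-averaging update of the smoothing anchor
        z = z + beta * (x - z)
        # probabilistic variance-reduced estimator update: occasionally
        # refresh with a full batch, otherwise apply a recursive
        # minibatch correction
        if rng.random() < p:
            vx, vy = full_grads(x, y)
        else:
            idx = rng.integers(0, n, size=batch_size)
            vx = vx + np.mean([grad_x_i(i, x, y) - grad_x_i(i, x_prev, y_prev)
                               for i in idx], axis=0)
            vy = vy + np.mean([grad_y_i(i, x, y) - grad_y_i(i, x_prev, y_prev)
                               for i in idx], axis=0)
    return x, y


if __name__ == "__main__":
    # toy finite-sum minimax instance (convex in x, strongly concave in y)
    # used only to exercise the routine:
    #   min_x max_y (1/n) sum_i [ y^T (A_i x - b_i) - (lam/2) ||y||^2 ]
    n, d = 20, 5
    rng = np.random.default_rng(1)
    A = rng.normal(size=(n, d, d))
    b = rng.normal(size=(n, d))
    lam = 0.5
    gx = lambda i, x, y: A[i].T @ y
    gy = lambda i, x, y: A[i] @ x - b[i] - lam * y
    x_hat, y_hat = pvr_sgda(gx, gy, n, np.zeros(d), np.zeros(d))
    print("x:", x_hat)
    print("y:", y_hat)
```

In this sketch, the occasional full-batch refresh combined with the recursive minibatch correction is what a PAGE-type analysis uses to avoid a bounded-variance assumption; whether the paper uses exactly this switching rule cannot be determined from the abstract alone.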