Almost Sure Convergence and Non-Asymptotic Concentration Bounds for Stochastic Mirror Descent Algorithm


Detailed bibliography
Published in: IEEE Control Systems Letters, Volume 8, pp. 2397–2402
Main Authors: Paul, Anik Kumar; Mahindrakar, Arun D.; Kalaimani, Rachel K.
Format: Journal Article
Language: English
Published: IEEE, 2024
ISSN: 2475-1456
Description
Summary: This letter investigates the convergence and concentration properties of the Stochastic Mirror Descent (SMD) algorithm utilizing biased stochastic subgradients. We establish the almost sure convergence of the algorithm's iterates under the assumption of diminishing bias. Furthermore, we derive concentration bounds for the discrepancy between the iterates' function values and the optimal value, based on standard assumptions. Subsequently, leveraging the assumption of sub-Gaussian noise in the stochastic subgradients, we present refined concentration bounds for this discrepancy.
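For context on the method the abstract refers to, here is a minimal sketch of a stochastic mirror descent iteration, using the negative-entropy mirror map on the probability simplex (the exponentiated-gradient update). The toy objective, noise model, and diminishing step sizes below are illustrative assumptions, not the specific setup of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def smd_entropic(grad_oracle, x0, steps, n_iter):
    """Stochastic mirror descent with the negative-entropy mirror map on
    the probability simplex (multiplicative-weights form of the update)."""
    x = x0.copy()
    avg = np.zeros_like(x)
    for t in range(n_iter):
        g = grad_oracle(x)             # (possibly biased) stochastic subgradient
        x = x * np.exp(-steps[t] * g)  # mirror step taken in the dual space
        x /= x.sum()                   # Bregman projection back onto the simplex
        avg += x
    return avg / n_iter                # averaged iterate

# Toy problem (assumed for illustration): minimize f(x) = <c, x> over the
# simplex, observing the gradient c corrupted by Gaussian noise.
c = np.array([0.3, 0.1, 0.5])
oracle = lambda x: c + 0.1 * rng.standard_normal(3)
steps = 0.5 / np.sqrt(np.arange(1, 1001))  # diminishing step sizes
x_avg = smd_entropic(oracle, np.ones(3) / 3, steps, 1000)
```

With diminishing step sizes the averaged iterate concentrates mass on the coordinate with the smallest cost, consistent with the convergence behavior the letter analyzes in far greater generality.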
DOI: 10.1109/LCSYS.2024.3482148