Step Size Adaptation for Accelerated Stochastic Momentum Algorithm Using SDE Modeling and Lyapunov Drift Minimization
| Published in: | IEEE Transactions on Signal Processing, Vol. 73, pp. 3124-3139 |
|---|---|
| Main Authors: | , , |
| Format: | Journal Article |
| Language: | English |
| Published: | New York: IEEE, 2025 (The Institute of Electrical and Electronics Engineers, Inc.) |
| Subjects: | |
| ISSN: | 1053-587X, 1941-0476 |
| Online Access: | Get full text |
| Summary: | Training machine learning models often involves solving high-dimensional stochastic optimization problems, where stochastic gradient-based algorithms are hindered by slow convergence. Although momentum-based methods perform well in deterministic settings, their effectiveness diminishes under gradient noise. In this paper, we introduce a novel accelerated stochastic momentum algorithm. Specifically, we first model the trajectory of discrete-time momentum-based algorithms using continuous-time stochastic differential equations (SDEs). By leveraging a tailored Lyapunov function, we derive 2-D adaptive step sizes through Lyapunov drift minimization, which significantly enhance both convergence speed and noise stability. The proposed algorithm not only accelerates convergence but also eliminates the need for hyperparameter fine-tuning, consistently achieving robust accuracy in machine learning tasks. |
|---|---|
| DOI: | 10.1109/TSP.2025.3592678 |
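The abstract's central idea, choosing momentum step sizes at each iteration by minimizing the one-step drift of a Lyapunov function, can be illustrated with a toy sketch. This is not the paper's algorithm: the quadratic objective, the Lyapunov function `V(x, v) = ||x||² + ||v||²`, the candidate grids for the step size and momentum coefficient, and the heavy-ball update form are all assumptions made here for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_grad(x, noise=0.1):
    # Stochastic gradient of the toy objective f(x) = 0.5 * ||x||^2
    # (minimum at the origin), corrupted by additive Gaussian noise.
    return x + noise * rng.standard_normal(x.shape)

def lyapunov_drift(x, v, alpha, beta, g):
    # Estimated one-step drift of the assumed Lyapunov function
    # V(x, v) = ||x||^2 + ||v||^2 under a heavy-ball update.
    v_new = beta * v - alpha * g
    x_new = x + v_new
    return (x_new @ x_new + v_new @ v_new) - (x @ x + v @ v)

def adaptive_momentum(x0, steps=200):
    x, v = x0.copy(), np.zeros_like(x0)
    alphas = [0.01, 0.05, 0.1]   # hypothetical step-size candidates
    betas = [0.5, 0.8, 0.9]      # hypothetical momentum candidates
    for _ in range(steps):
        g = noisy_grad(x)
        # Drift minimization: pick the (alpha, beta) pair whose
        # estimated Lyapunov drift is most negative for this gradient.
        alpha, beta = min(
            ((a, b) for a in alphas for b in betas),
            key=lambda ab: lyapunov_drift(x, v, ab[0], ab[1], g),
        )
        v = beta * v - alpha * g
        x = x + v
    return x

x_final = adaptive_momentum(np.array([5.0, -3.0]))
print(np.linalg.norm(x_final))  # should be small, i.e. near the optimum
```

With the fixed seed, the iterate contracts from its starting point toward the optimum up to the noise floor, which is the qualitative behavior the abstract attributes to drift-minimizing step sizes; the paper's actual 2-D step sizes are derived in closed form from an SDE model rather than selected from a grid.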