Step Size Adaptation for Accelerated Stochastic Momentum Algorithm Using SDE Modeling and Lyapunov Drift Minimization

Detailed Bibliography
Published in: IEEE Transactions on Signal Processing, Vol. 73, pp. 3124–3139
Main Authors: Yuan, Yulan; Tsang, Danny H. K.; Lau, Vincent K. N.
Format: Journal Article
Language: English
Published: New York: IEEE, 2025
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
ISSN: 1053-587X, 1941-0476
Online Access: Get full text
Description
Summary: Training machine learning models often involves solving high-dimensional stochastic optimization problems, where stochastic gradient-based algorithms are hindered by slow convergence. Although momentum-based methods perform well in deterministic settings, their effectiveness diminishes under gradient noise. In this paper, we introduce a novel accelerated stochastic momentum algorithm. Specifically, we first model the trajectory of discrete-time momentum-based algorithms using continuous-time stochastic differential equations (SDEs). By leveraging a tailored Lyapunov function, we derive 2-D adaptive step sizes through Lyapunov drift minimization, which significantly enhance both convergence speed and noise stability. The proposed algorithm not only accelerates convergence but also eliminates the need for hyperparameter fine-tuning, consistently achieving robust accuracy in machine learning tasks.
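For readers unfamiliar with the class of methods the abstract refers to, the following is a minimal sketch of a generic stochastic heavy-ball momentum update on a noisy quadratic objective. It does not reproduce the paper's SDE-derived 2-D adaptive step sizes; the fixed step size `alpha`, momentum coefficient `beta`, and noise level `sigma` are hypothetical values chosen only to illustrate the baseline algorithm that the paper accelerates.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_grad(x, sigma=0.1):
    # Gradient of f(x) = 0.5 * ||x||^2, corrupted by Gaussian noise to
    # mimic the stochastic-gradient setting discussed in the abstract.
    return x + sigma * rng.standard_normal(x.shape)

x = np.ones(10)           # iterate, initialized away from the minimizer 0
v = np.zeros_like(x)      # momentum (velocity) buffer
alpha, beta = 0.1, 0.9    # hypothetical fixed step size and momentum weight

for _ in range(500):
    v = beta * v - alpha * noisy_grad(x)   # heavy-ball momentum update
    x = x + v

# Under gradient noise the iterate settles near, not at, the minimizer;
# adaptive step sizes (the paper's contribution) aim to shrink this gap.
print(np.linalg.norm(x))
```

With fixed `alpha` and `beta`, the iterate hovers in a noise-dominated ball around the optimum; the abstract's point is that step sizes derived by Lyapunov drift minimization adapt this trade-off automatically rather than requiring manual tuning.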
DOI: 10.1109/TSP.2025.3592678