Step Size Adaptation for Accelerated Stochastic Momentum Algorithm Using SDE Modeling and Lyapunov Drift Minimization


Bibliographic Details
Published in: IEEE Transactions on Signal Processing, Vol. 73, pp. 3124–3139
Main Authors: Yuan, Yulan; Tsang, Danny H. K.; Lau, Vincent K. N.
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2025
ISSN: 1053-587X, 1941-0476
Description
Summary: Training machine learning models often involves solving high-dimensional stochastic optimization problems, where stochastic gradient-based algorithms are hindered by slow convergence. Although momentum-based methods perform well in deterministic settings, their effectiveness diminishes under gradient noise. In this paper, we introduce a novel accelerated stochastic momentum algorithm. Specifically, we first model the trajectory of discrete-time momentum-based algorithms using continuous-time stochastic differential equations (SDEs). By leveraging a tailored Lyapunov function, we derive 2-D adaptive step sizes through Lyapunov drift minimization, which significantly enhance both convergence speed and noise stability. The proposed algorithm not only accelerates convergence but also eliminates the need for hyperparameter fine-tuning, consistently achieving robust accuracy in machine learning tasks.
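The paper's algorithm is not reproduced in this record; as a rough illustrative sketch only, the "2-D adaptive step sizes via Lyapunov drift minimization" idea can be mimicked on a toy quadratic: at each iteration, a stochastic heavy-ball update picks its step-size pair (alpha, beta) from a candidate grid by minimizing the one-step drift of a simple Lyapunov function. The candidate grid, the Lyapunov function V(x, v) = f(x) + ½‖v‖², and the noise model below are all assumptions for the demo, not the authors' construction.

```python
import numpy as np

# Toy problem: ill-conditioned quadratic f(x) = 0.5 * x^T A x with
# noisy gradient observations (noise model is an assumption for the demo).
rng = np.random.default_rng(0)
d = 10
A = np.diag(np.linspace(1.0, 10.0, d))

def f(x):
    return 0.5 * x @ A @ x

def noisy_grad(x):
    return A @ x + 0.1 * rng.standard_normal(d)

def lyapunov(x, v):
    # Simple surrogate Lyapunov function: objective plus kinetic energy.
    return f(x) + 0.5 * v @ v

x = np.ones(d)
v = np.zeros(d)
alphas = [0.01, 0.05, 0.1]   # candidate step sizes (hypothetical grid)
betas = [0.0, 0.5, 0.9]      # candidate momentum coefficients

for _ in range(300):
    g = noisy_grad(x)
    # "2-D" adaptation: for each (alpha, beta) pair, evaluate the Lyapunov
    # value after a trial heavy-ball step and keep the pair that minimizes
    # it, i.e. the pair with the most negative one-step drift.
    a, b = min(
        ((a, b) for a in alphas for b in betas),
        key=lambda ab: lyapunov(x + ab[1] * v - ab[0] * g,
                                ab[1] * v - ab[0] * g),
    )
    v = b * v - a * g   # heavy-ball velocity update
    x = x + v

print(f(x))  # small residual: drift minimization damps the gradient noise
```

Because the pair is re-selected every iteration, the sketch needs no manual step-size tuning on this toy problem, loosely echoing the abstract's claim that the method eliminates hyperparameter fine-tuning; the real algorithm derives the step sizes analytically from the SDE model rather than by grid search.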
DOI: 10.1109/TSP.2025.3592678