The exact worst-case convergence rate of the gradient method with fixed step lengths for L-smooth functions

Bibliographic Details
Published in: Optimization Letters, Vol. 16, No. 6, pp. 1649–1661
Main Authors: Abbaszadehpeivasti, Hadi, de Klerk, Etienne, Zamani, Moslem
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg, 01.07.2022
ISSN: 1862-4472, 1862-4480
Description
Summary: In this paper, we study the convergence rate of the gradient (or steepest descent) method with fixed step lengths for finding a stationary point of an L-smooth function. We establish a new convergence rate, and show that the bound may be exact in some cases, in particular when all step lengths lie in the interval (0, 1/L]. In addition, we derive an optimal step length with respect to the new bound.
DOI: 10.1007/s11590-021-01821-1
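
As a minimal illustrative sketch of the method described in the summary, gradient descent with a fixed step length t in the interval (0, 1/L] can be written as below. The quadratic test function, the matrix A, and the iteration budget are assumptions chosen for illustration only and are not taken from the paper, which analyzes the worst-case rate at which the gradient norm decreases rather than any particular instance.

import numpy as np

# Sketch of the gradient method with a fixed step length, applied to an
# L-smooth quadratic f(x) = 0.5 * x^T A x (gradient: A x). The matrix A,
# the step length t = 1/L, and the iteration count are illustrative choices.

A = np.diag([1.0, 4.0, 10.0])             # f is L-smooth with L = max eigenvalue of A
L = float(np.max(np.linalg.eigvalsh(A)))  # here L = 10

def grad(x):
    return A @ x                          # gradient of 0.5 * x^T A x

x = np.array([1.0, 1.0, 1.0])             # starting point
t = 1.0 / L                               # fixed step length in the interval (0, 1/L]

for _ in range(100):
    x = x - t * grad(x)                   # gradient step with fixed step length

print("final gradient norm:", np.linalg.norm(grad(x)))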