Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward–backward splitting, and regularized Gauss–Seidel methods

Bibliographic Details
Published in: Mathematical Programming, Vol. 137, No. 1–2, pp. 91–129
Main Authors: Attouch, Hedy, Bolte, Jérôme, Svaiter, Benar Fux
Format: Journal Article
Language:English
Published: Berlin/Heidelberg: Springer-Verlag (Springer Nature B.V.), 01.02.2013
ISSN:0025-5610, 1436-4646
Description
Summary: In view of the minimization of a nonsmooth nonconvex function f, we prove an abstract convergence result for descent methods that satisfy a sufficient-decrease assumption and allow a relative error tolerance. Our result guarantees the convergence of bounded sequences under the assumption that the function f satisfies the Kurdyka–Łojasiewicz inequality. This assumption covers a wide range of problems, including nonsmooth semi-algebraic (or, more generally, tame) minimization. Specializing our result to different kinds of structured problems yields several new convergence results for inexact versions of the gradient method, the proximal method, the forward–backward splitting algorithm, the gradient projection method, and some proximal regularizations of the Gauss–Seidel method in a nonconvex setting. Our results are illustrated by feasibility problems and iterative thresholding procedures for compressive sensing.
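The iterative thresholding procedure mentioned in the summary is an instance of forward–backward splitting applied to an l1-regularized least-squares objective: a gradient (forward) step on the smooth term followed by a proximal (backward) step on the l1 term, which is componentwise soft-thresholding. The following is a minimal sketch of that scheme (often called ISTA) on a hypothetical sparse-recovery instance; the variable names, problem sizes, and parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of tau * ||.||_1: componentwise soft-thresholding.
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def ista(A, b, lam, step, n_iter=500):
    """Forward-backward splitting for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    Each iteration takes a gradient step on the smooth quadratic term,
    then a proximal step on the nonsmooth l1 term.
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                  # gradient of smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Tiny illustrative sparse-recovery instance (all values assumed).
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[2] = 1.5                                   # one active coefficient
b = A @ x_true
step = 1.0 / np.linalg.norm(A, 2) ** 2            # step <= 1/L, L = ||A||_2^2
x_hat = ista(A, b, lam=0.1, step=step)
```

With a step size below the reciprocal of the Lipschitz constant of the gradient of the smooth part, the iterates satisfy exactly the sufficient-decrease condition studied in the paper, and the objective here is semi-algebraic, so the Kurdyka–Łojasiewicz convergence theory applies.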
DOI:10.1007/s10107-011-0484-9