On the Proximal Gradient Algorithm with Alternated Inertia
| Published in: | Journal of Optimization Theory and Applications, Vol. 176, No. 3, pp. 688-710 |
|---|---|
| Main Authors: | , |
| Format: | Journal Article |
| Language: | English |
| Published: | New York: Springer US, 01.03.2018 (Springer Nature B.V.; Springer Verlag) |
| Subjects: | |
| ISSN: | 0022-3239, 1573-2878 |
| Summary: | In this paper, we investigate attractive properties of the proximal gradient algorithm with inertia. Notably, we show that using alternated inertia yields monotonically decreasing functional values, which contrasts with usual accelerated proximal gradient methods. We also provide convergence rates for the algorithm with alternated inertia, based on local geometric properties of the objective function. The results are put into perspective by discussions on several extensions (strongly convex case, non-convex case, and alternated extrapolation) and illustrations on common regularized optimization problems. |
|---|---|
| DOI: | 10.1007/s10957-018-1226-4 |
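As a rough illustration of the scheme described in the summary, the sketch below runs a proximal gradient iteration in which the inertial (extrapolation) step is applied only on every other iteration, on a small lasso instance. The test problem, step size, and inertia parameter `alpha` are illustrative assumptions and are not taken from the paper; this is a minimal sketch of the idea, not the authors' exact method or parameter choices.

```python
import numpy as np


def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (component-wise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)


def alternated_inertia_prox_grad(A, b, lam, alpha=0.5, n_iter=500):
    """Proximal gradient for the lasso 0.5*||Ax - b||^2 + lam*||x||_1,
    with the inertial extrapolation applied only on alternate iterations.
    Step size, alpha, and the problem are illustrative choices."""
    gamma = 1.0 / np.linalg.norm(A, 2) ** 2  # step 1/L, L = ||A||_2^2 (Lipschitz constant of the gradient)
    x_prev = x = np.zeros(A.shape[1])
    objective = []
    for k in range(n_iter):
        # Alternated inertia: extrapolate only every other iteration.
        y = x + alpha * (x - x_prev) if k % 2 == 0 else x
        grad = A.T @ (A @ y - b)  # gradient of the smooth least-squares part at y
        x_prev, x = x, soft_threshold(y - gamma * grad, gamma * lam)
        objective.append(0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum())
    return x, objective


# Small random instance; `objective` can be inspected against the
# monotone decrease of functional values highlighted in the summary.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
b = rng.standard_normal(40)
x_hat, objective = alternated_inertia_prox_grad(A, b, lam=0.1)
print(objective[0], objective[-1])
```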