First-order methods of smooth convex optimization with inexact oracle

Bibliographic Details
Published in: Mathematical Programming, Vol. 146, No. 1–2, pp. 37–75
Main Authors: Devolder, Olivier; Glineur, François; Nesterov, Yurii
Format: Journal Article
Language:English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg (Springer Nature B.V.), 01.08.2014
ISSN: 0025-5610, 1436-4646
Description
Summary: We introduce the notion of inexact first-order oracle and analyze the behavior of several first-order methods of smooth convex optimization used with such an oracle. This notion of inexact oracle naturally appears in the context of smoothing techniques, Moreau–Yosida regularization, Augmented Lagrangians and many other situations. We derive complexity estimates for primal, dual and fast gradient methods, and study in particular their dependence on the accuracy of the oracle and the desired accuracy of the objective function. We observe that the superiority of fast gradient methods over the classical ones is no longer absolute when an inexact oracle is used. We prove that, contrary to simple gradient schemes, fast gradient methods must necessarily suffer from error accumulation. Finally, we show that the notion of inexact oracle allows the application of first-order methods of smooth convex optimization to solve non-smooth or weakly smooth convex problems.
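
Illustrative sketch (not code from the article): in the paper's framework, an inexact (δ, L)-oracle at a point y returns, roughly, a pair (f_δ(y), g_δ(y)) such that 0 ≤ f(x) − f_δ(y) − ⟨g_δ(y), x − y⟩ ≤ (L/2)‖x − y‖² + δ for all x. The toy Python code below assumes this setting and builds such an oracle for the non-smooth function f(x) = ‖x‖₁ via Huber smoothing with parameter mu (which gives δ = n·mu/2 and L = 1/mu), then runs a classical gradient scheme and a Nesterov-type fast gradient scheme with that oracle. All names (inexact_oracle, primal_gradient, fast_gradient, mu) are illustrative assumptions, not identifiers from the article.

```python
import numpy as np

def inexact_oracle(x, mu):
    """(delta, L)-oracle for f(x) = sum(|x_i|) obtained by Huber smoothing.

    The smoothed value/gradient pair satisfies the (delta, L)-oracle
    inequality with delta = mu * len(x) / 2 and L = 1 / mu.
    """
    small = np.abs(x) <= mu
    f_delta = np.sum(np.where(small, x**2 / (2 * mu), np.abs(x) - mu / 2))
    g_delta = np.where(small, x / mu, np.sign(x))
    return f_delta, g_delta

def primal_gradient(x0, oracle, L, iters):
    """Classical gradient method using only the inexact gradient."""
    x = x0.copy()
    for _ in range(iters):
        _, g = oracle(x)
        x = x - g / L                      # plain gradient step with step size 1/L
    return x

def fast_gradient(x0, oracle, L, iters):
    """Nesterov-type accelerated scheme using the same inexact oracle."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        _, g = oracle(y)
        x_new = y - g / L                  # gradient step at the extrapolated point
        t_new = (1 + np.sqrt(1 + 4 * t**2)) / 2
        y = x_new + (t - 1) / t_new * (x_new - x)   # momentum/extrapolation step
        x, t = x_new, t_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x0 = rng.standard_normal(50)
    mu = 1e-2                              # smoothing level: delta = n*mu/2, L = 1/mu
    oracle = lambda x: inexact_oracle(x, mu)
    for method in (primal_gradient, fast_gradient):
        x = method(x0, oracle, L=1.0 / mu, iters=300)
        print(f"{method.__name__}: final ||x||_1 = {np.sum(np.abs(x)):.4f}")
```

In this sketch the accuracy of the oracle (through mu) trades off against the effective Lipschitz constant, which is the kind of dependence the paper's complexity estimates quantify; the fast scheme is the one affected by error accumulation when δ > 0.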
DOI: 10.1007/s10107-013-0677-5