Fast convex optimization via inertial dynamics with Hessian driven damping

Published in: Journal of Differential Equations, Volume 261, Issue 10, pp. 5734–5783
Main authors: Attouch, Hedy; Peypouquet, Juan; Redont, Patrick
Format: Journal Article
Language: English
Published: Elsevier Inc., 15 November 2016
ISSN: 0022-0396, 1090-2732
Description
Abstract: We first study the fast minimization properties of the trajectories of the second-order evolution equation
$$\ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t) + \beta\,\nabla^2\Phi(x(t))\,\dot{x}(t) + \nabla\Phi(x(t)) = 0,$$
where $\Phi : H \to \mathbb{R}$ is a smooth convex function acting on a real Hilbert space $H$, and $\alpha$, $\beta$ are positive parameters. This inertial system combines an isotropic viscous damping which vanishes asymptotically and a geometric Hessian-driven damping, which makes it naturally related to Newton's and Levenberg–Marquardt methods. For $\alpha \ge 3$ and $\beta > 0$, along any trajectory, fast convergence of the values $\Phi(x(t)) - \min_H \Phi = O(t^{-2})$ is obtained, together with rapid convergence of the gradients $\nabla\Phi(x(t))$ to zero. For $\alpha > 3$, assuming only that $\operatorname{argmin}\Phi \neq \emptyset$, we show that any trajectory converges weakly to a minimizer of $\Phi$, and that $\Phi(x(t)) - \min_H \Phi = o(t^{-2})$. Strong convergence is established in various practical situations. In particular, in the strongly convex case, we obtain an even faster rate of convergence, which can be made arbitrarily fast by the choice of $\alpha$: more precisely, $\Phi(x(t)) - \min_H \Phi = O(t^{-\frac{2}{3}\alpha})$. We then extend the results to the case of a general proper lower-semicontinuous convex function $\Phi : H \to \mathbb{R} \cup \{+\infty\}$. This relies on the crucial property that the inertial dynamics with Hessian-driven damping can be equivalently written as a first-order system in time and space, which allows the dynamics to be extended by simply replacing the gradient with the subdifferential. Via explicit–implicit time discretization, this opens a gate to new, possibly more rapid, inertial algorithms, expanding the field of FISTA methods for structured convex optimization problems.
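As an illustrative sketch (not taken from the paper), the second-order dynamics can be simulated by a naive explicit Euler discretization for a simple quadratic $\Phi$, where the Hessian is constant; the step size, parameter values, and starting time $t = 1$ below are arbitrary assumptions for the demonstration:

```python
import numpy as np

# Explicit Euler discretization of
#   x''(t) + (alpha/t) x'(t) + beta * Hess(x(t)) x'(t) + grad(x(t)) = 0
# for the quadratic Phi(x) = 0.5 x^T A x, whose minimum value is 0.
alpha, beta = 3.1, 0.5            # alpha > 3 as in the weak-convergence regime
A = np.diag([1.0, 10.0])          # constant Hessian of the quadratic
grad = lambda x: A @ x
phi = lambda x: 0.5 * x @ (A @ x)

h = 1e-3                          # small step size, for stability
t = 1.0                           # start away from the singular damping at t = 0
x = np.array([1.0, 1.0])
v = np.zeros(2)                   # velocity x'(t)

for _ in range(200_000):
    # acceleration x''(t) solved from the ODE
    a = -(alpha / t) * v - beta * (A @ v) - grad(x)
    x = x + h * v
    v = v + h * a
    t += h

print(phi(x))  # decays toward min Phi = 0 as t grows
```

The observed decay of `phi(x)` is consistent with the $O(t^{-2})$ value estimate; a practical solver would instead use the paper's first-order reformulation or an implicit scheme.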
DOI: 10.1016/j.jde.2016.08.020