Minimizing finite sums with the stochastic average gradient

Detailed Bibliography
Published in: Mathematical Programming, Volume 162, Issue 1-2, pp. 83-112
Main authors: Schmidt, Mark; Le Roux, Nicolas; Bach, Francis
Medium: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg, 01.03.2017
ISSN: 0025-5610, 1436-4646
Description
Summary: We analyze the stochastic average gradient (SAG) method for optimizing the sum of a finite number of smooth convex functions. Like stochastic gradient (SG) methods, the SAG method's iteration cost is independent of the number of terms in the sum. However, by incorporating a memory of previous gradient values, the SAG method achieves a faster convergence rate than black-box SG methods. The convergence rate is improved from O(1/√k) to O(1/k) in general, and when the sum is strongly convex the convergence rate is improved from the sub-linear O(1/k) to a linear convergence rate of the form O(ρ^k) for ρ < 1. Further, in many cases the convergence rate of the new method is also faster than that of black-box deterministic gradient methods, in terms of the number of gradient evaluations. This extends our earlier work, Le Roux et al. (Adv Neural Inf Process Syst, 2012), which only led to a faster rate for well-conditioned strongly convex problems. Numerical experiments indicate that the new algorithm often dramatically outperforms existing SG and deterministic gradient methods, and that the performance may be further improved through the use of non-uniform sampling strategies.
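
The summary sketches the core of the SAG update: store the most recently evaluated gradient of each term, and at every iteration refresh one randomly chosen entry and step along the average of the stored gradients. The Python sketch below illustrates this idea; the function sag, its arguments, and the least-squares example are illustrative assumptions, not code from the paper (only the 1/(16L) step size follows the paper's analysis).

import numpy as np

def sag(grad_i, n, x0, step_size, num_iters, seed=0):
    # Minimize (1/n) * sum_i f_i(x) with the stochastic average gradient update.
    # grad_i(i, x) must return the gradient of the i-th term f_i at x.
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    grad_table = np.zeros((n, x.size))   # most recent gradient seen for each f_i
    grad_sum = np.zeros(x.size)          # running sum of the rows of grad_table
    for _ in range(num_iters):
        i = rng.integers(n)              # sample one term uniformly at random
        g = grad_i(i, x)                 # one new gradient evaluation per iteration
        grad_sum += g - grad_table[i]    # swap the stale gradient out of the running sum
        grad_table[i] = g
        x -= (step_size / n) * grad_sum  # step along the average of the stored gradients
    return x

# Illustrative use on least-squares terms f_i(x) = 0.5 * (A[i] @ x - b[i]) ** 2,
# with step size 1/(16 L) as in the paper's theory, where L = max_i ||A[i]||^2:
A = np.random.default_rng(1).normal(size=(200, 5))
b = A @ np.ones(5)
L = np.max(np.sum(A * A, axis=1))
x_hat = sag(lambda i, x: (A[i] @ x - b[i]) * A[i],
            n=200, x0=np.zeros(5), step_size=1.0 / (16 * L), num_iters=20000)

Because the running sum is updated incrementally, each iteration costs one gradient evaluation plus O(dim) arithmetic, independent of n, which is the property the summary emphasizes.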
DOI: 10.1007/s10107-016-1030-6