Minimizing finite sums with the stochastic average gradient
We analyze the stochastic average gradient (SAG) method for optimizing the sum of a finite number of smooth convex functions. Like stochastic gradient (SG) methods, the SAG method's iteration cost is independent of the number of terms in the sum. However, by incorporating a memory of previous gradient values, the SAG method achieves a faster convergence rate than black-box SG methods.
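To make the "memory of previous gradient values" idea concrete, the sketch below shows a SAG-style update on a simple least-squares objective. This is an illustration under stated assumptions, not the paper's reference implementation: the function name `sag_least_squares`, the synthetic data, the step size `alpha`, and the zero initialization of the gradient memory are all choices made here for the example.

```python
# Minimal sketch of a SAG-style update (illustrative, not the paper's code)
# for f(x) = (1/n) * sum_i (a_i^T x - b_i)^2. Step size and data are
# assumptions chosen for the demo.
import numpy as np

def sag_least_squares(A, b, alpha=0.01, iters=20000, seed=0):
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    grads = np.zeros((n, d))   # memory: last gradient seen for each f_i
    grad_sum = np.zeros(d)     # running sum of the stored gradients
    for _ in range(iters):
        i = rng.integers(n)                 # sample one term uniformly
        g = 2.0 * (A[i] @ x - b[i]) * A[i]  # fresh gradient of f_i at x
        grad_sum += g - grads[i]            # refresh the stored entry
        grads[i] = g
        x -= (alpha / n) * grad_sum         # step along the stored average
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 5))
    x_true = rng.standard_normal(5)
    b = A @ x_true
    x_hat = sag_least_squares(A, b)
    print(np.linalg.norm(x_hat - x_true))   # should be small after convergence
```

Note the property the abstract highlights: each iteration computes only one fresh gradient (cost independent of n), while the stored sum lets the step use information from all n terms.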
| Published in: | Mathematical Programming, Vol. 162, No. 1-2, pp. 83-112 |
|---|---|
| Main Authors: | Mark Schmidt, Nicolas Le Roux, Francis Bach |
| Format: | Journal Article |
| Language: | English |
| Published: | Berlin/Heidelberg: Springer Berlin Heidelberg, 01.03.2017 |
| ISSN: | 0025-5610, 1436-4646 |