Multi-stage stochastic gradient method with momentum acceleration
• Stage-wise optimization and momentum have been widely employed to accelerate SGD (see the sketch after this list).
• Negative momentum provides acceleration and stabilization for stochastic first-order methods.
• Negative momentum extends Nesterov's momentum to stage-wise optimization.
• Gradient correction avoids the oscillations and...
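For background on the first highlight, below is a minimal sketch of generic stage-wise SGD with heavy-ball momentum: the step size is held fixed within a stage and reduced between stages. This is an illustrative assumption-based example, not the paper's method (its negative-momentum and gradient-correction details are not reproduced in this record); the names `grad_fn`, `stage_lrs`, and `beta` are hypothetical.

```python
import numpy as np

def stagewise_sgd_momentum(grad_fn, w0, stage_lrs, steps_per_stage, beta=0.9, rng=None):
    """Stage-wise SGD with heavy-ball momentum (generic sketch).

    grad_fn(w, rng) returns a stochastic gradient at w.
    stage_lrs gives one step size per stage; it is held fixed within a
    stage and typically decreased from one stage to the next.
    """
    rng = np.random.default_rng() if rng is None else rng
    w = np.asarray(w0, dtype=float).copy()
    v = np.zeros_like(w)                      # momentum buffer
    for lr in stage_lrs:                      # one loop iteration per stage
        for _ in range(steps_per_stage):
            g = grad_fn(w, rng)               # stochastic gradient
            v = beta * v - lr * g             # heavy-ball momentum update
            w = w + v
    return w

# Toy usage: noisy gradients of f(w) = 0.5 * ||w||^2.
if __name__ == "__main__":
    noisy_grad = lambda w, rng: w + 0.1 * rng.standard_normal(w.shape)
    w_final = stagewise_sgd_momentum(noisy_grad, np.ones(5),
                                     stage_lrs=[0.5, 0.1, 0.02],
                                     steps_per_stage=200)
    print(w_final)
```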
| Published in: | Signal Processing, Vol. 188, p. 108201 |
|---|---|
| Main Authors: | |
| Format: | Journal Article |
| Language: | English |
| Published: | Elsevier B.V., 01.11.2021 |
| Subjects: | |
| ISSN: | 0165-1684, 1872-7557 |