Noise amplification of momentum-based optimization algorithms
| Published in: | Proceedings of the American Control Conference, pp. 849-854 |
|---|---|
| Main authors: | , , |
| Format: | Conference proceedings |
| Language: | English |
| Published: | American Automatic Control Council, 31.05.2023 |
| Subjects: | |
| ISSN: | 2378-5861 |
| Online access: | Full text |
| Abstract: | We study momentum-based first-order optimization algorithms in which the iterations utilize information from the two previous steps and are subject to additive white noise. For strongly convex quadratic problems, we utilize the Jury stability criterion to provide a novel geometric characterization of linear convergence and exploit this insight to derive alternative proofs of standard convergence results and identify fundamental performance tradeoffs. We use the steady-state variance of the error in the optimization variable to quantify noise amplification and establish analytical lower bounds on the product of the settling time and the smallest/largest achievable noise amplification that scale quadratically with the condition number. This extends the prior work [1], where only the special cases of Polyak's heavy-ball and Nesterov's accelerated algorithms were studied. We also use this geometric characterization to introduce a parameterized family of algorithms that strikes a balance between noise amplification and settling time while preserving order-wise Pareto optimality. |
|---|---|
| DOI: | 10.23919/ACC55779.2023.10156292 |
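To make the tradeoff described in the abstract concrete, the following is a minimal simulation sketch (not taken from the paper): it runs Polyak's heavy-ball iteration on a scalar strongly convex quadratic driven by additive white noise and empirically estimates the steady-state variance of the error in the optimization variable. The step size, momentum values, noise level, and function name below are illustrative assumptions, not the authors' choices or results.

```python
# Illustrative sketch: heavy-ball iteration on f(x) = 0.5 * q * x^2 with
# additive white noise; the minimizer is x* = 0, so x_k itself is the error.
import numpy as np

def heavy_ball_noise_variance(q=1.0, alpha=0.1, beta=0.5, sigma=0.01,
                              iters=200_000, burn_in=50_000, seed=0):
    """Run x_{k+1} = x_k - alpha * q * x_k + beta * (x_k - x_{k-1}) + w_k,
    w_k ~ N(0, sigma^2), and return the empirical steady-state variance
    of x_k after a burn-in period (assumed illustrative parameters)."""
    rng = np.random.default_rng(seed)
    x_prev, x = 1.0, 1.0          # arbitrary initial condition
    errors = []
    for k in range(iters):
        grad = q * x              # gradient of 0.5 * q * x^2
        x_next = x - alpha * grad + beta * (x - x_prev) + sigma * rng.standard_normal()
        x_prev, x = x, x_next
        if k >= burn_in:
            errors.append(x)
    return np.var(errors)

if __name__ == "__main__":
    # Larger momentum (beta) typically shortens settling time on
    # ill-conditioned problems but amplifies the injected noise --
    # the kind of tradeoff the paper quantifies analytically.
    for beta in (0.0, 0.3, 0.6, 0.9):
        print(f"beta={beta:.1f}  steady-state var ~ {heavy_ball_noise_variance(beta=beta):.2e}")
```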