Noise amplification of momentum-based optimization algorithms
| Published in: | Proceedings of the American Control Conference, pp. 849-854 |
|---|---|
| Main Authors: | , , |
| Format: | Conference Proceeding |
| Language: | English |
| Published: | American Automatic Control Council, 31.05.2023 |
| Subjects: | |
| ISSN: | 2378-5861 |
| Summary: | We study momentum-based first-order optimization algorithms in which the iterations utilize information from the two previous steps and are subject to additive white noise. For strongly convex quadratic problems, we utilize the Jury stability criterion to provide a novel geometric characterization of linear convergence and exploit this insight to derive alternative proofs of standard convergence results and identify fundamental performance tradeoffs. We use the steady-state variance of the error in the optimization variable to quantify noise amplification and establish analytical lower bounds on the product of the settling time and the smallest/largest achievable noise amplification; these bounds scale quadratically with the condition number. This extends the prior work [1], where only the special cases of Polyak's heavy-ball and Nesterov's accelerated algorithms were studied. We also use this geometric characterization to introduce a parameterized family of algorithms that strikes a balance between noise amplification and settling time while preserving order-wise Pareto optimality. |
|---|---|
| ISSN: | 2378-5861 |
| DOI: | 10.23919/ACC55779.2023.10156292 |
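
To illustrate the setting described in the summary, the following is a minimal numerical sketch (not taken from the paper) of a two-step momentum iteration on a strongly convex quadratic with additive white noise, using Polyak's heavy-ball tuning as the special case; the problem instance, noise level, and empirical variance estimate are illustrative assumptions rather than the paper's parameterized family or experiments.

```python
import numpy as np

# Illustrative setup (assumption): f(x) = 0.5 * x^T Q x with condition number kappa,
# minimized at x* = 0, with additive white noise entering the iteration.
np.random.seed(0)
kappa = 100.0
L, m = kappa, 1.0                         # largest / smallest eigenvalues of Q
Q = np.diag(np.linspace(m, L, 10))        # simple diagonal quadratic
sigma = 1e-2                              # standard deviation of the white noise

# Standard heavy-ball parameters for a strongly convex quadratic; the paper studies
# a general two-step family that contains this tuning as a special case.
alpha = 4.0 / (np.sqrt(L) + np.sqrt(m)) ** 2
beta = ((np.sqrt(L) - np.sqrt(m)) / (np.sqrt(L) + np.sqrt(m))) ** 2

def noisy_heavy_ball(iters=100000, burn_in=20000):
    """Run x_{k+1} = x_k - alpha * grad f(x_k) + beta * (x_k - x_{k-1}) + noise
    and return an empirical estimate of the steady-state error variance."""
    n = Q.shape[0]
    x_prev = np.zeros(n)
    x = np.zeros(n)
    errs = []
    for k in range(iters):
        grad = Q @ x                                      # gradient of the quadratic
        noise = sigma * np.random.randn(n)                # additive white noise
        x_next = x - alpha * grad + beta * (x - x_prev) + noise
        x_prev, x = x, x_next
        if k >= burn_in:
            errs.append(np.sum(x ** 2))                   # squared error ||x_k - x*||^2
    # Empirical steady-state variance of the error in the optimization variable,
    # i.e. the noise-amplification measure referred to in the summary above.
    return np.mean(errs)

print("empirical steady-state error variance:", noisy_heavy_ball())
```

Rerunning the sketch with larger `kappa` (and the same heavy-ball tuning) gives one way to observe the kind of settling-time versus noise-amplification tradeoff the summary describes, though the exact scaling results are established analytically in the paper.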