Metric selection in fast dual forward–backward splitting

Detailed bibliography
Published in: Automatica (Oxford), Volume 62, pp. 1–10
Main authors: Giselsson, Pontus; Boyd, Stephen
Format: Journal Article
Language: English
Published: Elsevier Ltd, 01.12.2015
ISSN: 0005-1098, 1873-2836
Description
Summary: The performance of fast forward–backward splitting, or equivalently fast proximal gradient methods, depends on the conditioning of the optimization problem data. This conditioning is related to a metric that is defined by the space on which the optimization problem is stated; selecting a space on which the optimization data is better conditioned improves the performance of the algorithm. In this paper, we propose several methods, with different computational complexity, to find a space on which the algorithm performs well. We evaluate the proposed metric selection procedures by comparing their performance to the case when the Euclidean space is used. For the most ill-conditioned problem we consider, the computational complexity is improved by two to three orders of magnitude. We also report performance comparable or superior to state-of-the-art optimization software.
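
The abstract only states the idea; as a rough illustration of how a metric enters a fast forward–backward iteration, the sketch below runs a FISTA-type loop for a lasso problem in a diagonal metric M: the forward step uses the M-scaled gradient y − M⁻¹∇f(y), and the backward step is the prox of the ℓ₁ term in the M-norm (a coordinate-wise soft-threshold at λ/M_ii). This is an assumed, simplified primal setup, not the authors' dual algorithm or their metric selection procedures; the diagonal choice M_ii = Σ_j |AᵀA|_ij, the problem data, and the function names are illustrative, and M = L·I recovers the usual Euclidean method.

```python
# Illustrative sketch only (not the paper's dual algorithm): a fast
# forward-backward (FISTA-type) iteration for the lasso problem
#   minimize 0.5*||A x - b||^2 + lam*||x||_1
# run in a diagonal metric M = diag(M_diag).
import numpy as np

def fista_with_metric(A, b, lam, M_diag, iters=300):
    """Fast forward-backward splitting with a diagonal metric M = diag(M_diag).

    Requires M >= A^T A (as matrices); M_ii = row sums of |A^T A| is one
    simple valid choice, and M = ||A||_2^2 * I gives the Euclidean method.
    """
    n = A.shape[1]
    x = np.zeros(n)
    y = x.copy()
    t = 1.0
    for _ in range(iters):
        x_old = x
        # Forward (gradient) step in the metric M: y - M^{-1} grad f(y).
        grad = A.T @ (A @ y - b)
        z = y - grad / M_diag
        # Backward step: prox of lam*||.||_1 in the M-norm is a
        # coordinate-wise soft-threshold with thresholds lam / M_ii.
        x = np.sign(z) * np.maximum(np.abs(z) - lam / M_diag, 0.0)
        # Standard Nesterov/FISTA momentum update.
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x + ((t - 1.0) / t_new) * (x - x_old)
        t = t_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Badly scaled columns make the Euclidean metric a poor fit.
    A = rng.standard_normal((200, 500)) * rng.uniform(0.01, 1.0, 500)
    b = A @ (rng.standard_normal(500) * (rng.random(500) < 0.05)) + 0.01 * rng.standard_normal(200)
    lam = 0.1

    M_euclid = np.full(500, np.linalg.norm(A, 2) ** 2)   # M = L*I: plain FISTA
    M_diag = np.abs(A.T @ A).sum(axis=1)                 # diagonal metric, M >= A^T A

    for name, M in [("Euclidean metric", M_euclid), ("diagonal metric", M_diag)]:
        x = fista_with_metric(A, b, lam, M)
        obj = 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum()
        print(f"{name:18s} objective after 300 iterations: {obj:.4f}")
```

The diagonal choice majorizes AᵀA because M − AᵀA is diagonally dominant with a nonnegative diagonal, so the step sizes remain valid while poorly scaled coordinates get much larger steps than the single Euclidean step length allows.
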
DOI: 10.1016/j.automatica.2015.09.010