Metric selection in fast dual forward–backward splitting


Bibliographic Details
Published in: Automatica (Oxford), Vol. 62, pp. 1–10
Main Authors: Giselsson, Pontus; Boyd, Stephen
Format: Journal Article
Language: English
Published: Elsevier Ltd, 01.12.2015
ISSN: 0005-1098, 1873-2836
Description
Summary: The performance of fast forward–backward splitting, or equivalently fast proximal gradient methods, depends on the conditioning of the optimization problem data. This conditioning is tied to a metric defined by the space on which the optimization problem is stated; selecting a space on which the problem data are better conditioned improves the performance of the algorithm. In this paper, we propose several methods, of differing computational complexity, for finding a space on which the algorithm performs well. We evaluate the proposed metric selection procedures by comparing their performance to that obtained with the standard Euclidean metric. For the most ill-conditioned problem we consider, the computational cost is reduced by two to three orders of magnitude. We also report performance comparable or superior to state-of-the-art optimization software.
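To illustrate the idea behind the summary, the sketch below runs a fast proximal gradient method (FISTA) on a badly scaled lasso-type problem, once in the Euclidean metric and once after a simple diagonal change of variables that normalizes the column scales. This is only a generic Jacobi-style rescaling chosen for illustration, not the metric selection procedures proposed in the paper, and the problem data are made up.

```python
import numpy as np

def obj(A, b, w, x):
    # Composite objective: 0.5*||Ax - b||^2 + sum_i w_i * |x_i|
    return 0.5 * np.sum((A @ x - b) ** 2) + np.sum(w * np.abs(x))

def fista(A, b, w, n_iter=300):
    # Fast proximal gradient (FISTA) for the objective above.
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the smooth gradient
    x = y = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(n_iter):
        z = y - (A.T @ (A @ y - b)) / L        # forward (gradient) step
        x_new = np.sign(z) * np.maximum(np.abs(z) - w / L, 0.0)  # backward (prox) step
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)            # Nesterov momentum
        x, t = x_new, t_new
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((60, 30)) * np.logspace(0, 3, 30)  # badly scaled columns
b = rng.standard_normal(60)
w = np.ones(30)                                # uniform l1 weights (lambda = 1)

# Euclidean metric: run the algorithm on the problem as stated.
x_plain = fista(A, b, w)

# Diagonal metric (Jacobi-style rescaling, an illustrative choice):
# substitute x = y / d with d_i = ||a_i||, which normalizes the columns
# of A and reweights the l1 term to w_i / d_i; solve in y, map back to x.
d = np.linalg.norm(A, axis=0)
x_prec = fista(A / d, b, w / d) / d

print("Euclidean metric objective:", obj(A, b, w, x_plain))
print("Rescaled metric objective: ", obj(A, b, w, x_prec))
```

A better-conditioned metric lets the forward step use a larger effective step size along poorly scaled coordinates, which is the mechanism behind the speedups the abstract reports; the paper's procedures choose the metric far more carefully than the column-norm heuristic used here.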
DOI:10.1016/j.automatica.2015.09.010