A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints

Detailed bibliography
Published in: Computational Optimization and Applications, Volume 57; Issue 2; pp. 307-337
Main authors: Necoara, Ion; Patrascu, Andrei
Format: Journal Article
Language: English
Published: Boston: Springer US, 01.03.2014
Springer Nature B.V.
Subjects:
ISSN: 0926-6003, 1573-2894
Online access: Get full text
Description
Summary: In this paper we propose a variant of the random coordinate descent method for solving linearly constrained convex optimization problems with composite objective functions. If the smooth part of the objective function has Lipschitz continuous gradient, then we prove that our method obtains an ϵ-optimal solution in O(n²/ϵ) iterations, where n is the number of blocks. For the class of problems with cheap coordinate derivatives we show that the new method is faster than methods based on full-gradient information. Analysis for the rate of convergence in probability is also provided. For strongly convex functions our method converges linearly. Extensive numerical tests confirm that on very large problems, our method is much more numerically efficient than methods based on full gradient information.
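The record only summarizes the results. As a purely illustrative sketch (not the algorithm analyzed in the paper), the following Python snippet shows the generic idea behind coordinate descent under a single linear coupling constraint: a randomly sampled pair of coordinates is updated along a direction that keeps the constraint sum(x) = b satisfied. The quadratic objective, the uniform pair sampling, and the exact line search are assumptions made only for this example.

import numpy as np

# Illustrative sketch only (assumed setup): minimize the smooth quadratic
# f(x) = 0.5 * ||x - c||^2 subject to sum(x) = b by sampling a coordinate
# pair (i, j) and moving along e_i - e_j, a direction that leaves the
# coupling constraint sum(x) = b unchanged.

def random_pair_coordinate_descent(c, b, iters=20000, seed=0):
    rng = np.random.default_rng(seed)
    n = len(c)
    x = np.full(n, b / n)              # feasible start: sum(x) = b
    for _ in range(iters):
        i, j = rng.choice(n, size=2, replace=False)
        g = x - c                      # gradient of f at the current point
        # Exact minimization of f along x + t*(e_i - e_j):
        # d/dt f(x + t*d) = (g_i - g_j) + 2*t = 0  =>  t = -(g_i - g_j)/2
        t = -(g[i] - g[j]) / 2.0
        x[i] += t
        x[j] -= t                      # sum(x) stays equal to b
    return x

if __name__ == "__main__":
    c = np.array([3.0, -1.0, 2.0, 0.5])
    x = random_pair_coordinate_descent(c, b=1.0)
    print(x, x.sum())                  # the sum remains (numerically) 1.0

In a genuine coordinate descent implementation only the two partial derivatives g[i] and g[j] would be evaluated per iteration, which is what makes such methods cheap on very large problems, as the abstract points out.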
DOI: 10.1007/s10589-013-9598-8