A feasible SQP-GS algorithm for nonconvex, nonsmooth constrained optimization

Bibliographic Details
Published in: Numerical Algorithms, Vol. 65, No. 1, pp. 1–22
Main Authors: Tang, Chun-ming; Liu, Shuai; Jian, Jin-bao; Li, Jian-ling
Format: Journal Article
Language: English
Published: Boston: Springer US (Springer Nature B.V.), 01.01.2014
ISSN: 1017-1398, 1572-9265
Description
Summary: The gradient sampling (GS) algorithm for minimizing a nonconvex, nonsmooth function was proposed by Burke et al. (SIAM J Optim 15:751–779, 2005); its most notable feature is the use of randomly sampled gradients instead of subgradients. In this paper, combining the GS technique with the sequential quadratic programming (SQP) method, we present a feasible SQP-GS algorithm that extends the GS algorithm to nonconvex, nonsmooth constrained optimization. The proposed algorithm generates a sequence of feasible iterates and guarantees that the objective function is monotonically decreasing. Global convergence is proved in the sense that, with probability one, every cluster point of the iterative sequence is stationary for the improvement function. Finally, some preliminary numerical results show that the proposed algorithm is effective.
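The gradient-sampling idea the abstract refers to can be illustrated with a minimal sketch. The following is an unofficial, simplified example (not the authors' SQP-GS method, and unconstrained) for the 1-D nonsmooth function f(x) = |x|: at each iterate, gradients are randomly sampled at nearby differentiable points, the minimum-norm element of their convex hull serves as a surrogate subgradient, and a backtracking line search keeps the objective monotonically decreasing. The function names, sampling radius `eps`, and sample count `m` are illustrative choices, not taken from the paper.

```python
import random

def f(x):
    # Nonconvex GS targets are general; |x| is the simplest nonsmooth demo.
    return abs(x)

def grad_f(x):
    # f is differentiable for x != 0, with gradient sign(x).
    return 1.0 if x > 0 else -1.0

def gs_direction(x, eps, m, rng):
    # Sample m gradients at random points in the interval [x - eps, x + eps].
    gs = [grad_f(x + rng.uniform(-eps, eps)) for _ in range(m)]
    lo, hi = min(gs), max(gs)
    # In 1-D the convex hull of the samples is the interval [lo, hi], so the
    # minimum-norm element has a closed form; in R^n this is a small QP.
    g = 0.0 if lo <= 0.0 <= hi else (lo if lo > 0.0 else hi)
    return -g  # negated min-norm element = search direction

def gs_minimize(x0, eps=0.1, m=10, steps=50, t=0.5, seed=0):
    rng = random.Random(seed)
    x = x0
    for _ in range(steps):
        d = gs_direction(x, eps, m, rng)
        if d == 0.0:
            eps *= 0.5  # near-stationary within the sampling radius: shrink it
            continue
        # Backtracking line search enforces monotone decrease of f.
        step = t
        while f(x + step * d) >= f(x) and step > 1e-12:
            step *= 0.5
        x += step * d
    return x

# Starting from x = 2.0, the iterates descend toward the kink at 0,
# where the sampled gradients straddle zero and the radius shrinks.
x_final = gs_minimize(2.0)
```

In the paper's constrained setting, this sampling step is embedded in an SQP subproblem so that the iterates additionally remain feasible; the sketch above only conveys the sampling-and-min-norm mechanism.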
DOI: 10.1007/s11075-012-9692-5