A feasible SQP-GS algorithm for nonconvex, nonsmooth constrained optimization

Detailed description

Bibliographic details
Published in: Numerical Algorithms, Vol. 65, No. 1, pp. 1–22
Authors: Tang, Chun-ming; Liu, Shuai; Jian, Jin-bao; Li, Jian-ling
Format: Journal Article
Language: English
Published: Boston: Springer US, 01.01.2014 (Springer Nature B.V.)
ISSN: 1017-1398, 1572-9265
Online access: Full text
Description
Abstract: The gradient sampling (GS) algorithm for minimizing a nonconvex, nonsmooth function was proposed by Burke et al. (SIAM J Optim 15:751–779, 2005); its most distinctive feature is the use of randomly sampled gradients in place of subgradients. In this paper, combining the GS technique with the sequential quadratic programming (SQP) method, we present a feasible SQP-GS algorithm that extends the GS algorithm to nonconvex, nonsmooth constrained optimization. The proposed algorithm generates a sequence of feasible iterates and guarantees that the objective function values decrease monotonically. Global convergence is proved in the sense that, with probability one, every cluster point of the iterative sequence is stationary for the improvement function. Finally, preliminary numerical results show that the proposed algorithm is effective.
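To illustrate the gradient-sampling technique the abstract builds on, here is a minimal Python sketch of one unconstrained GS step in the spirit of Burke et al. (2005). It is not the paper's feasible SQP-GS method, which additionally handles constraints through SQP subproblems and an improvement function; the function names (min_norm_in_hull, gradient_sampling_step), the sample size, and the backtracking parameters are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize


def min_norm_in_hull(G):
    """Minimum-norm point in the convex hull of the columns of G.

    Solves min_lam ||G @ lam||^2 s.t. lam >= 0, sum(lam) = 1 as a
    small dense QP via SLSQP; adequate for a handful of gradients.
    """
    m = G.shape[1]
    Q = G.T @ G
    res = minimize(
        lambda lam: lam @ Q @ lam,
        np.full(m, 1.0 / m),                   # start at the barycenter
        jac=lambda lam: 2.0 * Q @ lam,
        bounds=[(0.0, None)] * m,
        constraints=[{"type": "eq", "fun": lambda lam: lam.sum() - 1.0}],
        method="SLSQP",
    )
    return G @ res.x


def gradient_sampling_step(f, grad, x, eps=0.1, m=None, beta=1e-4, rng=None):
    """One unconstrained gradient-sampling step (sketch).

    Samples gradients uniformly in an eps-ball around x, takes the
    negative minimum-norm convex combination as the search direction,
    and backtracks until a sufficient-decrease condition holds.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = x.size
    m = n + 1 if m is None else m              # GS theory asks for >= n + 1 samples
    u = rng.normal(size=(m, n))                # random directions ...
    u /= np.linalg.norm(u, axis=1, keepdims=True)
    r = rng.uniform(size=(m, 1)) ** (1.0 / n)  # ... and radii -> uniform in the ball
    pts = x + eps * r * u
    G = np.column_stack([grad(x)] + [grad(p) for p in pts])
    g = min_norm_in_hull(G)                    # approximate eps-steepest descent
    d = -g
    t = 1.0
    while f(x + t * d) > f(x) - beta * t * (g @ g) and t > 1e-12:
        t *= 0.5                               # Armijo-style backtracking
    return x + t * d


# Example: minimize the nonsmooth function |x0| + 2|x1|, which is
# differentiable everywhere except on the coordinate axes.
f = lambda x: abs(x[0]) + 2.0 * abs(x[1])
grad = lambda x: np.array([np.sign(x[0]), 2.0 * np.sign(x[1])])
rng = np.random.default_rng(0)
x = np.array([1.0, -1.5])
for _ in range(100):
    x = gradient_sampling_step(f, grad, x, rng=rng)
print(x)  # ends up near the minimizer (0, 0), up to the sampling radius eps
```

The small min-norm QP in this sketch plays a role analogous to the quadratic subproblems of the paper's SQP-GS method, which additionally keeps every iterate feasible and measures progress through the improvement function.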
DOI: 10.1007/s11075-012-9692-5