A nonmonotone scaled conjugate gradient algorithm for large-scale unconstrained optimization

Published in: International Journal of Computer Mathematics, Vol. 95, No. 11, pp. 2212-2228
Main authors: Ou, Yigui; Zhou, Xin
Format: Journal Article
Language: English
Published: Abingdon: Taylor & Francis Ltd, 2 November 2018
ISSN: 0020-7160, 1029-0265
Description
Summary: This paper proposes a nonmonotone scaled conjugate gradient algorithm for solving large-scale unconstrained optimization problems, which combines the idea of the scaled memoryless Broyden-Fletcher-Goldfarb-Shanno (BFGS) preconditioned conjugate gradient method with the nonmonotone technique. An attractive property of the proposed method is that the search direction provides a sufficient descent direction at each iteration; this property is independent of the line search used. Under appropriate assumptions, the method is proven to be globally convergent for nonconvex smooth functions and R-linearly convergent for strongly convex functions. Preliminary numerical results and related comparisons show the efficiency of the proposed method in practical computation.
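
The method described in the summary combines a scaled memoryless BFGS preconditioned conjugate gradient direction with a nonmonotone acceptance rule. The sketch below is a minimal, hypothetical illustration of that general class of methods, not the algorithm of Ou and Zhou: it assumes a Grippo-Lampariello-Lucidi-style nonmonotone Armijo rule over the last M objective values, the scaling theta = s^T y / y^T y, and the parameter values shown; the function names (smbfgs_direction, nonmonotone_armijo, nmscg) are placeholders introduced only for this example.

```python
# Illustrative sketch only: a generic scaled memoryless BFGS preconditioned CG
# direction with a nonmonotone (GLL-style) Armijo line search. All choices
# (scaling, nonmonotone rule, parameters) are assumptions, not the paper's method.
import numpy as np

def smbfgs_direction(g, s, y):
    """Direction d = -H g, where H is the memoryless BFGS update of theta*I
    with the assumed scaling theta = s^T y / y^T y."""
    sy = s @ y
    if sy <= 1e-12 * np.linalg.norm(s) * np.linalg.norm(y):
        return -g                        # curvature too small: steepest descent
    theta = sy / (y @ y)
    rho = 1.0 / sy
    sg, yg = s @ g, y @ g
    return (-theta * g
            + theta * rho * (yg * s + sg * y)
            - (theta * rho**2 * (y @ y) + rho) * sg * s)

def nonmonotone_armijo(f, x, fx_hist, g, d, delta=1e-4, beta=0.5, M=10):
    """Nonmonotone Armijo backtracking: accept alpha once
    f(x + alpha d) <= max(last M objective values) + delta * alpha * g^T d."""
    f_ref = max(fx_hist[-M:])
    gd = g @ d
    alpha = 1.0
    while f(x + alpha * d) > f_ref + delta * alpha * gd:
        alpha *= beta
    return alpha

def nmscg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Nonmonotone scaled CG loop (illustrative)."""
    x = x0.copy()
    g = grad(x)
    d = -g
    fx_hist = [f(x)]
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = nonmonotone_armijo(f, x, fx_hist, g, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        d = smbfgs_direction(g_new, s, y)
        x, g = x_new, g_new
        fx_hist.append(f(x))
    return x

# Usage on a simple ill-conditioned quadratic test problem:
if __name__ == "__main__":
    A = np.diag(np.arange(1.0, 101.0))
    f = lambda x: 0.5 * x @ A @ x
    grad = lambda x: A @ x
    x_star = nmscg(f, grad, np.ones(100))
    print(np.linalg.norm(grad(x_star)))
```

Applying the direction formula vector by vector, instead of forming the matrix H explicitly, keeps the per-iteration cost at a few inner products and vector updates, which is the usual motivation for memoryless quasi-Newton preconditioning in large-scale problems.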
DOI: 10.1080/00207160.2017.1368498