A new three-term conjugate gradient algorithm with modified gradient-differences for solving unconstrained optimization problems


Detailed bibliography
Published in: AIMS Mathematics, Vol. 8, No. 2, pp. 2473-2488
Main Authors: Guo, Jie; Wan, Zhong
Format: Journal Article
Language: English
Published: AIMS Press, 01.01.2023
ISSN: 2473-6988
Description
Summary: Unconstrained optimization problems often arise in big-data mining and scientific computing. Based on a modified gradient-difference, this article presents a new three-term conjugate gradient algorithm for efficiently solving unconstrained optimization problems. Unlike those of existing nonlinear conjugate gradient algorithms, the search directions in this algorithm are always sufficiently descent, independent of any line search, and also satisfy a conjugacy property. Using the standard Wolfe line search, global and local convergence of the proposed algorithm is proved under mild assumptions. When the developed algorithm is applied to 750 benchmark test problems from the literature, its numerical performance is shown to be remarkable, especially in comparison with that of other similar efficient algorithms.
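The abstract describes the method only at a high level. As a rough illustration of what a three-term conjugate gradient iteration looks like, the sketch below forms the direction d_{k+1} = -g_{k+1} + beta_k*d_k - theta_k*y_k from the gradient difference y_k = g_{k+1} - g_k, using a Hestenes-Stiefel-type parameter and a backtracking Armijo search in place of the standard Wolfe search. The specific update formulas and safeguards here are generic assumptions, not the article's actual modified gradient-difference scheme.

```python
import numpy as np

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic three-term CG: d_{k+1} = -g_{k+1} + beta_k*d_k - theta_k*y_k.

    Illustrative only; not the article's modified gradient-difference method.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (a simple stand-in for the
        # standard Wolfe search assumed in the article).
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                    # gradient difference y_k
        denom = max(d.dot(y), 1e-12)     # safeguard the denominator
        beta = g_new.dot(y) / denom      # Hestenes-Stiefel-type parameter
        theta = g_new.dot(d) / denom     # coefficient of the third term
        d = -g_new + beta * d - theta * y
        if g_new.dot(d) >= 0:            # restart with steepest descent if
            d = -g_new                   # the descent property is lost
        x, g = x_new, g_new
    return x

# Usage: minimize a strictly convex quadratic f(x) = 0.5*x'Ax - b'x,
# whose unique minimizer solves Ax = b (here x* = [0.2, 0.4]).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A.dot(x)) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = three_term_cg(f, grad, [0.0, 0.0])
```

The restart test `g_new.dot(d) >= 0` is one common way to enforce the sufficient-descent property the abstract claims holds automatically for the article's directions; a faithful implementation would follow the paper's own formulas instead.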
DOI:10.3934/math.2023128