A new three-term conjugate gradient algorithm with modified gradient-differences for solving unconstrained optimization problems


Bibliographic Details
Published in: AIMS Mathematics, Vol. 8, no. 2, pp. 2473-2488
Main Authors: Guo, Jie; Wan, Zhong
Format: Journal Article
Language: English
Published: AIMS Press, 01.01.2023
ISSN: 2473-6988
Description
Summary: Unconstrained optimization problems often arise from big-data mining and scientific computing. On the basis of a modified gradient-difference, this article presents a new three-term conjugate gradient algorithm for efficiently solving unconstrained optimization problems. Compared with existing nonlinear conjugate gradient algorithms, the search directions generated by this algorithm are always sufficiently descent, independent of any line search, and also satisfy a conjugacy condition. Under the standard Wolfe line search and mild assumptions, global and local convergence of the proposed algorithm is proved. When the developed algorithm is applied to 750 benchmark test problems from the literature, its numerical performance is shown to be remarkable, especially in comparison with that of other similar efficient algorithms.
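The record does not give the paper's exact update formula, but the general shape of a three-term conjugate gradient method with a guaranteed sufficient-descent property can be illustrated. The sketch below uses the classical three-term update d_{k+1} = -g_{k+1} + beta_k d_k - theta_k y_k with y_k = g_{k+1} - g_k, whose coefficients make d_{k+1}^T g_{k+1} = -||g_{k+1}||^2 exactly; it is a generic illustration, not the modified gradient-difference scheme of Guo and Wan, and for brevity it substitutes a simple Armijo backtracking line search for the Wolfe line search used in the paper.

```python
import numpy as np

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic three-term CG sketch (illustrative only, not the paper's method).

    Direction update: d_{k+1} = -g_{k+1} + beta_k * d_k - theta_k * y_k,
    with y_k = g_{k+1} - g_k. The chosen beta_k and theta_k cancel so that
    d_{k+1}^T g_{k+1} = -||g_{k+1}||^2, i.e. sufficient descent holds
    regardless of the step length produced by the line search.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search -- a stand-in for the Wolfe search.
        t, c1, fx = 1.0, 1e-4, f(x)
        while f(x + t * d) > fx + c1 * t * g.dot(d):
            t *= 0.5
            if t < 1e-12:
                break
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g
        denom = max(d.dot(y), 1e-12)          # guard against division by zero
        beta = g_new.dot(y) / denom           # HS-type conjugacy coefficient
        theta = g_new.dot(d) / denom          # third-term coefficient
        d = -g_new + beta * d - theta * y     # three-term direction
        x, g = x_new, g_new
    return x
```

As a usage example, minimizing the convex quadratic f(x) = 0.5 x^T A x - b^T x drives the gradient A x - b to zero, so the returned point approximately solves A x = b.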
DOI:10.3934/math.2023128