Nonlinear conjugate gradient methods for unconstrained set optimization problems whose objective functions have finite cardinality


Detailed Bibliography
Published in: Optimization, Volume 74, Issue 15, pp. 3839-3878
Main authors: Kumar, Krishan; Ghosh, Debdas; Yao, Jen-Chih; Zhao, Xiaopeng
Medium: Journal Article
Language: English
Published: Philadelphia: Taylor & Francis, 18 November 2025
ISSN: 0233-1934, 1029-4945
Description
Summary: In this paper, we propose nonlinear conjugate gradient methods for unconstrained set optimization problems in which the objective function is given by a finite number of continuously differentiable vector-valued functions. First, we provide a general conjugate gradient algorithm that uses the Wolfe line search but imposes no explicit restriction on the conjugate parameter. We then study two variants of the algorithm, namely Fletcher-Reeves and conjugate descent, corresponding to two different choices of the conjugate parameter under the same line search rule. In the general algorithm, the direction of movement at each iterate is obtained by finding a descent direction of a vector optimization problem, which is constructed with the help of the concept of the partition set at the current iterate. Since this vector optimization problem changes from iteration to iteration, the conventional conjugate gradient method for vector optimization cannot be directly extended to the set optimization problem under consideration. The well-definedness of the methods is established. Further, we prove a Zoutendijk-type condition, which assists in proving the global convergence of the methods even without a regularity condition on the stationary points. No convexity assumption on the objective function is required for the convergence analysis. Lastly, numerical examples are presented to illustrate the performance of the proposed methods. We compare the proposed conjugate gradient methods with the existing steepest descent method for set optimization and find that the proposed methods commonly outperform it.
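For orientation, the Fletcher-Reeves variant mentioned in the summary specializes the classical nonlinear conjugate gradient scheme. Below is a minimal sketch of Fletcher-Reeves CG with a strong Wolfe line search for a scalar objective only; the paper's set-valued machinery (partition sets, per-iterate vector subproblems) is not reproduced here, and the function names, tolerances, and fallback step size are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import line_search

def fletcher_reeves_cg(f, grad, x0, tol=1e-6, max_iter=200):
    """Scalar-objective sketch of Fletcher-Reeves nonlinear CG with a
    strong Wolfe line search; NOT the paper's set-valued method."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # SciPy's line_search enforces the strong Wolfe conditions;
        # c2 < 1/2 keeps Fletcher-Reeves directions descent directions.
        alpha = line_search(f, grad, x, d, c2=0.1)[0]
        if alpha is None:               # line search failed: small safe step
            alpha = 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves parameter
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Illustrative use on a convex quadratic whose minimizer is the origin
A = np.diag([1.0, 10.0])
x_star = fletcher_reeves_cg(lambda x: 0.5 * x @ A @ x,
                            lambda x: A @ x,
                            np.array([3.0, -2.0]))
```

The conjugate-descent variant studied in the paper differs only in the formula for the conjugate parameter beta; the surrounding iteration is the same.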
DOI: 10.1080/02331934.2024.2390116