Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization

Published in: Numerical Algorithms, Vol. 54, No. 1, pp. 23–46
Main author: Andrei, Neculai
Format: Journal Article
Language: English
Publication details: Boston: Springer US, 01.05.2010
Springer Nature B.V.
ISSN: 1017-1398, 1572-9265
Description
Summary: An accelerated hybrid conjugate gradient algorithm is the subject of this paper. The parameter β_k is computed as a convex combination of β_k^HS (Hestenes and Stiefel, J Res Nat Bur Stand 49:409–436, 1952) and β_k^DY (Dai and Yuan, SIAM J Optim 10:177–182, 1999), i.e. β_k^C = (1 − θ_k) β_k^HS + θ_k β_k^DY. The parameter θ_k in the convex combination is computed in such a way that the direction of the conjugate gradient algorithm equals the best direction we know, i.e. the Newton direction, while the pair (s_k, y_k) satisfies the modified secant condition given by Li et al. (J Comput Appl Math 202:523–539, 2007), B_{k+1} s_k = z_k, where z_k = y_k + (η_k / ‖s_k‖²) s_k, η_k = 6(f_k − f_{k+1}) + 3(g_k + g_{k+1})^T s_k, s_k = x_{k+1} − x_k and y_k = g_{k+1} − g_k. It is shown that, both for uniformly convex functions and for general nonlinear functions, the algorithm with a strong Wolfe line search is globally convergent. The algorithm uses an acceleration scheme that modifies the steplength α_k to improve the reduction of the function values along the iterations. Numerical comparisons with conjugate gradient algorithms show that this hybrid computational scheme outperforms a variant of the hybrid conjugate gradient algorithm given by Andrei (Numer Algorithms 47:143–156, 2008), in which the pair (s_k, y_k) satisfies the classical secant condition B_{k+1} s_k = y_k, as well as a number of other conjugate gradient algorithms, including Hestenes–Stiefel, Dai–Yuan, Polak–Ribière–Polyak, Liu–Storey, hybrid Dai–Yuan, and Gilbert–Nocedal. A set of 75 unconstrained optimization problems, each in 10 different dimensions, is used (Andrei, Adv Model Optim 10:147–161, 2008).
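
To make the construction concrete, the following is a minimal Python sketch of the two ingredients named in the abstract: the Li et al. (2007) modified secant vector z_k (with u_k = s_k) and the hybrid Hestenes–Stiefel/Dai–Yuan direction. The function names (li_modified_y, hybrid_direction) are illustrative, and the paper's closed-form θ_k is not reproduced here; it is taken as an input supplied by the caller.

    import numpy as np

    def li_modified_y(s_k, y_k, f_k, f_k1, g_k, g_k1):
        # Modified secant vector of Li et al. (2007) with u_k = s_k:
        #   z_k = y_k + (eta_k / ||s_k||^2) s_k,
        #   eta_k = 6 (f_k - f_{k+1}) + 3 (g_k + g_{k+1})^T s_k,
        # so the condition B_{k+1} s_k = z_k uses function values
        # as well as gradients.
        eta_k = 6.0 * (f_k - f_k1) + 3.0 * np.dot(g_k + g_k1, s_k)
        return y_k + (eta_k / np.dot(s_k, s_k)) * s_k

    def hybrid_direction(g_k1, d_k, y_k, theta_k):
        # Convex combination beta^C = (1 - theta) beta^HS + theta beta^DY,
        # with the standard HS and DY formulas sharing the denominator
        # d_k^T y_k. theta_k is clipped to [0, 1] so the combination stays
        # convex; the paper's theta_k (obtained by matching the Newton
        # direction under the modified secant condition) is assumed to be
        # computed by the caller.
        denom = np.dot(d_k, y_k)
        beta_hs = np.dot(g_k1, y_k) / denom
        beta_dy = np.dot(g_k1, g_k1) / denom
        theta = min(max(theta_k, 0.0), 1.0)
        beta = (1.0 - theta) * beta_hs + theta * beta_dy
        return -g_k1 + beta * d_k   # d_{k+1} = -g_{k+1} + beta_k^C d_k

At theta_k = 0 the direction reduces to pure Hestenes–Stiefel and at theta_k = 1 to pure Dai–Yuan, which is the sense in which the scheme is a hybrid of the two.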
DOI: 10.1007/s11075-009-9321-0