IOSR Journal of Engineering (IOSRJEN) ISSN (e): 2250-3021, ISSN (p): 2278-8719 Vol. 04, Issue 05 (May. 2014), ||V7|| PP 38-43
www.iosrjen.org
New Homotopy Conjugate Gradient for Unconstrained Optimization Using Hestenes-Stiefel and Conjugate Descent

Salah Gazi Sharee, Hussein Ageel Khatab
Dept. of Mathematics, Faculty of Science, University of Zakho, Kurdistan Region-Iraq

Abstract: In this paper, we suggest a hybrid conjugate gradient method for unconstrained optimization based on a homotopy formula. We compute the parameter β_k as a convex combination of β_k^{HS} (Hestenes-Stiefel) [5] and β_k^{CD} (Conjugate Descent) [3].

Keywords: Unconstrained optimization, line search, conjugate gradient method, homotopy formula.
I. INTRODUCTION
Suppose that f : R^n → R is a continuously differentiable function whose gradient is denoted by g(x). Consider the following nonlinear unconstrained optimization problem

    \min f(x), \quad x \in R^n.    (1.1)

The iterates for solving (1.1) are given by

    x_{k+1} = x_k + \alpha_k d_k, \quad k = 0, 1, \dots,    (1.2)

where α_k is a positive step size obtained by a line search and d_k is a search direction. The search direction at the very first iteration is the steepest descent direction d_0 = -g_0; the directions along the iterations are computed according to

    d_{k+1} = -g_{k+1} + \beta_k d_k, \quad k \ge 0,    (1.3)

where β_k ∈ R is known as the conjugate gradient coefficient, and different choices of β_k yield different conjugate gradient methods. Some well-known formulas are given as follows:

    \beta_k^{PR} = \frac{g_{k+1}^T (g_{k+1} - g_k)}{\|g_k\|^2},    (1.4)

    \beta_k^{FR} = \frac{g_{k+1}^T g_{k+1}}{\|g_k\|^2},    (1.5)

    \beta_k^{HS} = \frac{g_{k+1}^T (g_{k+1} - g_k)}{(g_{k+1} - g_k)^T d_k},    (1.6)

    \beta_k^{DY} = \frac{g_{k+1}^T g_{k+1}}{(g_{k+1} - g_k)^T d_k},    (1.7)

    \beta_k^{CD} = \frac{g_{k+1}^T g_{k+1}}{-d_k^T g_k},    (1.8)

    \beta_k^{LS} = \frac{g_{k+1}^T (g_{k+1} - g_k)}{-d_k^T g_k},    (1.9)
where g_{k+1} and g_k denote the gradients ∇f(x_{k+1}) and ∇f(x_k) at the points x_{k+1} and x_k, respectively, and ‖·‖ denotes the Euclidean norm of vectors. The line search in conjugate gradient algorithms is often based on the standard Wolfe conditions:

    f(x_k + \alpha_k d_k) - f(x_k) \le \delta \alpha_k g_k^T d_k,    (1.10)

    g(x_k + \alpha_k d_k)^T d_k \ge \sigma g_k^T d_k,    (1.11)

where 0 < δ < σ < 1. The strong Wolfe line search corresponds to

    f(x_k + \alpha_k d_k) - f(x_k) \le \delta \alpha_k g_k^T d_k,    (1.12)

    |g(x_k + \alpha_k d_k)^T d_k| \le \sigma |g_k^T d_k|,    (1.13)

where 0 < δ < σ < 1 [10].
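As a concrete reference for (1.4)-(1.9), the short sketch below evaluates the six coefficients with NumPy. It is illustrative only; the helper name cg_betas and its argument names are assumptions, not part of the paper, with g_new = g_{k+1}, g_old = g_k and d = d_k.

```python
import numpy as np

def cg_betas(g_new, g_old, d):
    """Classical conjugate gradient coefficients (1.4)-(1.9).

    g_new, g_old : gradients g_{k+1} and g_k as 1-D NumPy arrays
    d            : current search direction d_k
    """
    y = g_new - g_old                               # y_k = g_{k+1} - g_k
    return {
        "PR": (g_new @ y) / (g_old @ g_old),        # Polak-Ribiere     (1.4)
        "FR": (g_new @ g_new) / (g_old @ g_old),    # Fletcher-Reeves   (1.5)
        "HS": (g_new @ y) / (d @ y),                # Hestenes-Stiefel  (1.6)
        "DY": (g_new @ g_new) / (d @ y),            # Dai-Yuan          (1.7)
        "CD": (g_new @ g_new) / -(d @ g_old),       # Conjugate Descent (1.8)
        "LS": (g_new @ y) / -(d @ g_old),           # Liu-Storey        (1.9)
    }
```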
II. HYBRID CONJUGATE GRADIENT ALGORITHMS
Hybrid conjugate gradient algorithms are combinations of different conjugate gradient algorithms; they are proposed mainly in order to avoid the jamming phenomenon [1]. These methods are an important class of conjugate gradient algorithms [6], [9].
The methods of FR [4], DY [2] and CD [3] have strong convergence properties, but they may have modest practical performance due to jamming. On the other hand, the methods of PR [8], HS [5] and LS [7] may not always be convergent, but they often have better computational performance [1].

III. NEW HYBRID SUGGESTION

Our suggestion generates iterates x_0, x_1, x_2, ... by means of the recurrence x_{k+1} = x_k + α_k d_k, where the step size α_k > 0 is determined according to the Wolfe line search conditions (1.10) and (1.11), and the directions d_k are generated by the rule

    d_{k+1} = -g_{k+1} + \beta_k^{NEW} d_k,    (3.1)

where

    \beta_k^{NEW} = (1 - \theta_k)\,\beta_k^{HS} + \theta_k\,\beta_k^{CD}, \quad 0 \le \theta_k \le 1,

that is, with y_k = g_{k+1} - g_k,

    \beta_k^{NEW} = (1 - \theta_k)\,\frac{g_{k+1}^T y_k}{d_k^T y_k} - \theta_k\,\frac{\|g_{k+1}\|^2}{d_k^T g_k}.    (3.2)

Observe that if θ_k = 0 then β_k^{NEW} = β_k^{HS}, and if θ_k = 1 then β_k^{NEW} = β_k^{CD}. On the other hand, if 0 < θ_k < 1, then we can find β_k^{NEW} as follows. We know that

    d_{k+1} = -g_{k+1} + (1 - \theta_k)\,\frac{g_{k+1}^T y_k}{d_k^T y_k}\,d_k - \theta_k\,\frac{\|g_{k+1}\|^2}{d_k^T g_k}\,d_k.    (3.3)

Our motivation is to choose the parameter θ_k in such a way that the direction d_{k+1} given by (3.3) is the Newton direction. Therefore

    -\nabla^2 f(x_{k+1})^{-1} g_{k+1} = -g_{k+1} + (1 - \theta_k)\,\frac{g_{k+1}^T y_k}{d_k^T y_k}\,d_k - \theta_k\,\frac{\|g_{k+1}\|^2}{d_k^T g_k}\,d_k.

Multiplying both sides of the above equation by d_k^T ∇²f(x_{k+1}), we get

    -d_k^T g_{k+1} = -d_k^T \nabla^2 f(x_{k+1})\,g_{k+1} + (1 - \theta_k)\,\frac{g_{k+1}^T y_k}{d_k^T y_k}\,d_k^T \nabla^2 f(x_{k+1})\,d_k - \theta_k\,\frac{\|g_{k+1}\|^2}{d_k^T g_k}\,d_k^T \nabla^2 f(x_{k+1})\,d_k.

Since d_k^T ∇²f(x_{k+1}) = y_k^T, we have

    -d_k^T g_{k+1} = -y_k^T g_{k+1} + (1 - \theta_k)\,g_{k+1}^T y_k - \theta_k\,\frac{\|g_{k+1}\|^2\,(y_k^T d_k)}{d_k^T g_k},

which reduces to

    -d_k^T g_{k+1} = -\theta_k\,g_{k+1}^T y_k - \theta_k\,\frac{\|g_{k+1}\|^2\,(y_k^T d_k)}{d_k^T g_k}.

This implies that

    \theta_k = \frac{d_k^T g_{k+1}}{g_{k+1}^T y_k + \dfrac{\|g_{k+1}\|^2\,(y_k^T d_k)}{d_k^T g_k}},

or, equivalently,

    \theta_k = \frac{(d_k^T g_{k+1})\,(d_k^T g_k)}{g_{k+1}^T y_k\,(d_k^T g_k) + \|g_{k+1}\|^2\,(y_k^T d_k)}.    (3.4)
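To illustrate how (3.2) and (3.4) interact with the safeguards used later in Step 7 of the algorithm of Section 3.3, the following sketch computes θ_k and β_k^{NEW} from the current gradients and direction. It is a hypothetical helper with assumed names, consistent with the paper but not the authors' code.

```python
import numpy as np

def beta_new(g_new, g_old, d):
    """Hybrid coefficient beta^NEW of (3.2), with theta_k from (3.4).

    Falls back to beta^HS when theta_k <= 0 and to beta^CD when
    theta_k >= 1, mirroring Step 7 of the algorithm in Section 3.3.
    """
    y = g_new - g_old
    beta_hs = (g_new @ y) / (d @ y)                  # Hestenes-Stiefel (1.6)
    beta_cd = (g_new @ g_new) / -(d @ g_old)         # Conjugate Descent (1.8)
    # theta_k as in (3.4)
    num = (d @ g_new) * (d @ g_old)
    den = (g_new @ y) * (d @ g_old) + (g_new @ g_new) * (y @ d)
    theta = num / den
    if theta <= 0.0:
        return beta_hs
    if theta >= 1.0:
        return beta_cd
    return (1.0 - theta) * beta_hs + theta * beta_cd  # convex combination (3.2)
```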
3.1 Convergence of the new hybrid conjugate gradient algorithm

Theorem 3.1.1. Assume that d_k is a descent direction and that the step size α_k in (1.2), with β_k^{NEW} given by (3.2) and θ_k given by (3.4), is determined by the Wolfe line search (1.10) and (1.11). If 0 < θ_k < 1, then the direction d_{k+1} given by (3.3) is a descent direction.

Proof. From (3.3) and (3.4) we have
    d_{k+1} = -g_{k+1} + \left[1 - \frac{(d_k^T g_{k+1})(d_k^T g_k)}{g_{k+1}^T y_k\,(d_k^T g_k) + \|g_{k+1}\|^2\,(y_k^T d_k)}\right]\frac{g_{k+1}^T y_k}{d_k^T y_k}\,d_k - \frac{(d_k^T g_{k+1})(d_k^T g_k)}{g_{k+1}^T y_k\,(d_k^T g_k) + \|g_{k+1}\|^2\,(y_k^T d_k)}\cdot\frac{\|g_{k+1}\|^2}{d_k^T g_k}\,d_k.    (3.1.1)

Multiplying both sides of (3.1.1) by g_{k+1}^T, we get

    g_{k+1}^T d_{k+1} = -\|g_{k+1}\|^2 + \left[1 - \frac{(d_k^T g_{k+1})(d_k^T g_k)}{g_{k+1}^T y_k\,(d_k^T g_k) + \|g_{k+1}\|^2\,(y_k^T d_k)}\right]\frac{g_{k+1}^T y_k\,(g_{k+1}^T d_k)}{d_k^T y_k} - \frac{(d_k^T g_{k+1})(d_k^T g_k)}{g_{k+1}^T y_k\,(d_k^T g_k) + \|g_{k+1}\|^2\,(y_k^T d_k)}\cdot\frac{\|g_{k+1}\|^2\,(g_{k+1}^T d_k)}{d_k^T g_k},    (3.1.2)

that is,

    g_{k+1}^T d_{k+1} = -\|g_{k+1}\|^2 + \frac{g_{k+1}^T y_k\,(g_{k+1}^T d_k)}{d_k^T y_k} - \frac{(d_k^T g_{k+1})(d_k^T g_k)}{g_{k+1}^T y_k\,(d_k^T g_k) + \|g_{k+1}\|^2\,(y_k^T d_k)}\cdot\frac{g_{k+1}^T y_k\,(g_{k+1}^T d_k)}{d_k^T y_k} - \frac{(d_k^T g_{k+1})(d_k^T g_k)}{g_{k+1}^T y_k\,(d_k^T g_k) + \|g_{k+1}\|^2\,(y_k^T d_k)}\cdot\frac{\|g_{k+1}\|^2\,(g_{k+1}^T d_k)}{d_k^T g_k}.    (3.1.3)

The proof is complete if the step length α_k is chosen by an exact line search, which requires d_k^T g_{k+1} = 0. Now suppose the step length α_k is chosen by an inexact line search, so that d_k^T g_{k+1} ≠ 0. The first two terms of (3.1.3) are together less than or equal to zero, because the Hestenes-Stiefel (HS) algorithm satisfies the descent condition, i.e.

    -\|g_{k+1}\|^2 + \frac{g_{k+1}^T y_k\,(g_{k+1}^T d_k)}{d_k^T y_k} \le 0.    (3.1.4)

It remains to consider the third and fourth terms:

    -\frac{(d_k^T g_{k+1})(d_k^T g_k)}{g_{k+1}^T y_k\,(d_k^T g_k) + \|g_{k+1}\|^2\,(y_k^T d_k)}\cdot\frac{g_{k+1}^T y_k\,(g_{k+1}^T d_k)}{d_k^T y_k} - \frac{(d_k^T g_{k+1})(d_k^T g_k)}{g_{k+1}^T y_k\,(d_k^T g_k) + \|g_{k+1}\|^2\,(y_k^T d_k)}\cdot\frac{\|g_{k+1}\|^2\,(g_{k+1}^T d_k)}{d_k^T g_k}

    = -\frac{(d_k^T g_{k+1})^2}{d_k^T y_k}\cdot\frac{g_{k+1}^T y_k\,(d_k^T g_k) + \|g_{k+1}\|^2\,(d_k^T y_k)}{g_{k+1}^T y_k\,(d_k^T g_k) + \|g_{k+1}\|^2\,(y_k^T d_k)}

    = -\frac{(d_k^T g_{k+1})^2}{d_k^T y_k}.    (3.1.5)
We know that (d_k^T g_{k+1})^2 is greater than or equal to zero and that d_k^T y_k > 0. Consequently,

    -\frac{(d_k^T g_{k+1})^2}{d_k^T y_k} \le 0,

which implies g_{k+1}^T d_{k+1} ≤ 0. The proof is complete. ∎

Theorem 3.1.2. Assume that the conditions of Theorem 3.1.1 hold and that

    \frac{y_k^T g_{k+1}\,(d_k^T g_{k+1})}{y_k^T d_k} \le \|g_{k+1}\|^2.

Then there exists a constant c_1 > 0 such that g_{k+1}^T d_{k+1} ≤ -c_1 ‖g_{k+1}‖^2, i.e. the direction d_{k+1} satisfies the sufficient descent condition.

Proof. From (3.3) and (3.4) we have

    d_{k+1} = -g_{k+1} + (1 - \theta_k)\,\frac{g_{k+1}^T y_k}{d_k^T y_k}\,d_k - \theta_k\,\frac{\|g_{k+1}\|^2}{d_k^T g_k}\,d_k,    (3.1.6)

with θ_k given by (3.4).
Multiplying both sides of (3.1.6) by g_{k+1}^T, we get

    g_{k+1}^T d_{k+1} = -\|g_{k+1}\|^2 + (1 - \theta_k)\,\frac{g_{k+1}^T y_k\,(g_{k+1}^T d_k)}{d_k^T y_k} - \theta_k\,\frac{\|g_{k+1}\|^2\,(g_{k+1}^T d_k)}{d_k^T g_k}.    (3.1.7)

Substituting θ_k from (3.4) and simplifying as in the proof of Theorem 3.1.1, after some operations we get

    g_{k+1}^T d_{k+1} \le -\frac{(d_k^T g_{k+1})(d_k^T g_{k+1})}{d_k^T y_k}.    (3.1.8)

Multiplying and dividing the right-hand side of (3.1.8) by y_k^T g_{k+1}, we get

    g_{k+1}^T d_{k+1} \le -\frac{(d_k^T g_{k+1})(d_k^T g_{k+1})(y_k^T g_{k+1})}{(y_k^T g_{k+1})(d_k^T y_k)}.    (3.1.9)

By the hypothesis, (3.1.9) gives

    g_{k+1}^T d_{k+1} \le -\frac{d_k^T g_{k+1}}{y_k^T g_{k+1}}\,\|g_{k+1}\|^2.

Multiplying and dividing the right-hand side by y_k^T g_{k+1}, we get

    g_{k+1}^T d_{k+1} \le -\frac{(d_k^T g_{k+1})(y_k^T g_{k+1})}{(y_k^T g_{k+1})^2}\,\|g_{k+1}\|^2.

Since d_k^T g_{k+1} < d_k^T y_k, this yields

    g_{k+1}^T d_{k+1} \le -\frac{(d_k^T y_k)(y_k^T g_{k+1})}{(y_k^T g_{k+1})^2}\,\|g_{k+1}\|^2.    (3.1.10)

Now, if y_k^T g_{k+1} > 0, let

    c_1 = \frac{(d_k^T y_k)(y_k^T g_{k+1})}{(y_k^T g_{k+1})^2}.

Then (3.1.10) gives g_{k+1}^T d_{k+1} ≤ -c_1 ‖g_{k+1}‖^2.

If y_k^T g_{k+1} < 0, then, since d_k^T y_k > 0, we have (d_k^T y_k)(y_k^T g_{k+1}) < d_k^T y_k, and (3.1.10) gives

    g_{k+1}^T d_{k+1} \le -\frac{d_k^T y_k}{(y_k^T g_{k+1})^2}\,\|g_{k+1}\|^2.

In this case let

    c_1 = \frac{d_k^T y_k}{(y_k^T g_{k+1})^2}.

Hence g_{k+1}^T d_{k+1} ≤ -c_1 ‖g_{k+1}‖^2.
The proof is complete. ∎

3.2 Theorem of global convergence
Since the new hybrid conjugate gradient algorithm satisfies the sufficient descent condition under the Wolfe conditions, the new hybrid conjugate gradient algorithm satisfies the global convergence property.
3.3 Algorithm of the New Hybrid Conjugate Gradient Method

Step 1: Set k = 0 and select the initial point x_k.
Step 2: Compute g_k = ∇f(x_k). If g_k = 0, then stop; else set d_k = -g_k.
Step 3: Compute α_k > 0 satisfying the Wolfe line search conditions to minimize f(x_{k+1}).
Step 4: Set x_{k+1} = x_k + α_k d_k.
Step 5: Compute g_{k+1} = ∇f(x_{k+1}). If g_{k+1} = 0, then stop.
Step 6: Compute θ_k as in (3.4).
Step 7: If 0 < θ_k < 1, compute β_k^{NEW} as in (3.2). If θ_k ≥ 1, set β_k^{NEW} = β_k^{CD}. If θ_k ≤ 0, set β_k^{NEW} = β_k^{HS}.
Step 8: Set d_{k+1} = -g_{k+1} + β_k^{NEW} d_k.
Step 9: If k = n, then go to Step 2; else set k = k + 1 and go to Step 3.
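Assembled, Steps 1-9 give the rough Python/NumPy sketch below. It is not the FORTRAN95 program used for the experiments of Section 3.4; scipy.optimize.line_search is used as a stand-in Wolfe line search, beta_new refers to the hypothetical helper sketched in Section III, and the stopping test is the ∞-norm tolerance quoted in Section 3.4.

```python
import numpy as np
from scipy.optimize import line_search

def new_hybrid_cg(f, grad, x0, tol=1e-5, max_iter=10000):
    """New hybrid CG method (Steps 1-9 of Section 3.3), illustrative only."""
    x = np.asarray(x0, dtype=float)          # Step 1
    g = grad(x)                              # Step 2
    d = -g
    k = 0
    while np.linalg.norm(g, np.inf) > tol and k < max_iter:
        # Step 3: Wolfe step (conditions (1.10)-(1.11)); alpha may be None on failure
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:
            d, alpha = -g, 1e-4              # crude fallback: restart along -g
        x = x + alpha * d                    # Step 4
        g_new = grad(x)                      # Step 5
        if np.linalg.norm(g_new, np.inf) <= tol:
            break
        beta = beta_new(g_new, g, d)         # Steps 6-7 (hypothetical helper above)
        d = -g_new + beta * d                # Step 8
        g = g_new
        k += 1
        if k % x.size == 0:                  # Step 9: periodic steepest-descent restart
            d = -g
    return x

# Illustrative call on the Rosenbrock function (cf. the "Rosen" rows in the table):
#   from scipy.optimize import rosen, rosen_der
#   x_min = new_hybrid_cg(rosen, rosen_der, np.full(100, -1.2))
```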
3.4 Numerical Results

This section is devoted to testing the implementation of the new formula. We compare the hybrid algorithm with the standard Hestenes-Stiefel (HS) and Conjugate Descent (CD) methods. The comparative tests involve well-known nonlinear problems (standard test functions) with different dimensions 4 ≤ n ≤ 5000. All programs are written in FORTRAN95, and in all cases the stopping condition is ‖g_{k+1}‖_∞ ≤ 10^{-5}. The results are given in the table below, which reports the number of function evaluations (NOF) and the number of iterations (NOI). The experimental results in the table confirm that the new CG method is superior to the standard CG methods with respect to NOI and NOF.
Table: Comparative performance of the three algorithms (standard HS, standard CD, and new formula).

Test fun.     N      HS NOI   HS NOF   CD NOI    CD NOF    New NOI   New NOF
Powell        4      65       170      994       2077      30        74
Powell        100    105      276      3102      6942      108       242
Powell        500    502      1062     134002    270117    502       1011
Powell        1000   637      1332     *         *         241       532
Powell        3000   879      1816     *         *         249       568
Powell        5000   1008     2074     *         *         409       913
Wood          4      26       59       353       709       26        61
Wood          100    27       61       928       2115      26        61
Wood          500    28       63       1008      2277      27        6
Wood          1000   28       63       561       1165      27        63
Wood          3000   28       63       500       1038      27        63
Wood          5000   28       63       517       1111      27        63
Cubic         4      15       43       540       1104      11        34
Cubic         100    14       40       801       2060      11        32
Cubic         500    14       40       1501      5287      11        32
Cubic         1000   14       40       *         *         11        32
Cubic         3000   14       40       *         *         11        32
Cubic         5000   14       40       *         *         11        32
Rosen         4      23       66       611       1262      23        64
Rosen         100    17       52       461       1374      21        62
Rosen         500    *        *        2063      6612      21        62
Rosen         1000   *        *        4036      13523     21        62
Rosen         3000   *        *        *         *         21        62
Rosen         5000   *        *        *         *         21        62
Mile          4      28       101      *         *         18        50
Mile          100    142      346      *         *         149       355
Mile          500    501      1108     *         *         501       1092
Mile          1000   1001     2312     *         *         998       2290
Mile          3000   1442     3252     *         *         1270      2834
Mile          5000   1660     3688     *         *         1418      3130
Non-Diagonal  4      23       61       249       558       23        59
Non-Diagonal  100    22       60       *         *         18        51
Non-Diagonal  500    22       59       *         *         18        53
Non-Diagonal  1000   22       59       *         *         18        53
Non-Diagonal  3000   22       59       *         *         19        55
Non-Diagonal  5000   22       59       *         *         19        55
Woolf         4      12       27       49        99        12        27
Woolf         100    49       99       901       1879      49        99
Woolf         500    56       113      3893      8393      55        111
Woolf         1000   70       141      8001      19675     69        139
Woolf         3000   166      343      *         *         166       343
Woolf         5000   176      365      *         *         175       363

IV. CONCLUSION
In this paper we have presented a new hybrid conjugate gradient method in which the parameter β_k is computed as a convex combination of β_k^{HS} and β_k^{CD}, and we have reported comparative numerical performances against the well-known Hestenes-Stiefel (HS) and Conjugate Descent (CD) conjugate gradient algorithms. We saw that the performance profile of our method was higher than those of the well-established conjugate gradient algorithms HS and CD.
REFERENCES
[1] N. Andrei, Hybrid conjugate gradient algorithm for unconstrained optimization, J. Optim. Theory Appl., 141, 2009, 249-264.
[2] Y.H. Dai and Y. Yuan, A nonlinear conjugate gradient method with a strong global convergence property, SIAM J. Optimization, 10, 1999, 177-182.
[3] R. Fletcher, Practical Methods of Optimization, Vol. 1: Unconstrained Optimization (John Wiley and Sons, New York, 1987).
[4] R. Fletcher and C. Reeves, Function minimization by conjugate gradients, Comput. J., 7, 1964, 149-154.
[5] M.R. Hestenes and E.L. Stiefel, Methods of conjugate gradients for solving linear systems, J. Research Nat. Bur. Standards, 49, 1952, 409-436.
[6] Y.F. Hu and C. Storey, Global convergence result for conjugate gradient methods, J. Optim. Theory Appl., 71, 1991, 399-405.
[7] Y. Liu and C. Storey, Efficient generalized conjugate gradient algorithms, Part 1: Theory, J. Optim. Theory Appl., 69, 1991, 129-137.
[8] E. Polak and G. Ribière, Note sur la convergence de méthodes de directions conjuguées, Rev. Française Informat. Recherche Opérationnelle, 3e Année, 16, 1969, 35-43.
[9] D. Touati-Ahmed and C. Storey, Efficient hybrid conjugate gradient techniques, J. Optim. Theory Appl., 64, 1990, 379-397.
[10] P. Wolfe, Convergence conditions for ascent methods, SIAM Rev., 11, 1969, 226-235.