Global Convergence of Conjugate Gradient Method in Unconstrained Optimization Problems

Abstract: In this study, we propose a new parameter for the conjugate gradient method. It is shown that the new method fulfills the sufficient descent condition together with the strong Wolfe conditions when an inexact line search is used. Numerical results also show that the proposed method outperforms other standard conjugate gradient methods.


Introduction
Consider the following unconstrained optimization problem:

min f(x), x ∈ R^n,

where f : R^n → R is a continuously differentiable function. Nonlinear conjugate gradient methods solve this problem efficiently by iterating, at the (k + 1)-th step,

x_{k+1} = x_k + α_k d_k,

where the step length α_k > 0 is computed by a line search and d_k denotes the search direction,

d_k = −g_k + β_k d_{k−1}, with d_0 = −g_0,

where g_k = ∇f(x_k) and β_k is a scalar conjugacy coefficient. There are several well-known formulas for this scalar, such as Hestenes-Stiefel (HS) [1], Fletcher-Reeves (FR) [2], Polak-Ribière (PR) [3], Conjugate Descent (CD) [4], Liu-Storey (LS) [5], and Dai-Yuan (DY) [6]. The convergence behavior of these conjugate gradient methods differs from formula to formula.
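The iteration above can be sketched in a few lines of Python. This is a generic illustration of the standard CG scheme, not the paper's method: the backtracking Armijo search below is a simplified stand-in for the Wolfe line search assumed in the paper, and the FR/HS coefficients are two of the standard formulas listed.

```python
import numpy as np

def conjugate_gradient(f, grad, x0, beta_rule="FR", tol=1e-6, max_iter=1000):
    """Generic nonlinear CG sketch; beta_rule selects the conjugacy coefficient.
    A simple backtracking (Armijo) search stands in for a strong Wolfe search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # initial direction: steepest descent
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha, c1 = 1.0, 1e-4                # backtracking line search for alpha_k
        while f(x + alpha * d) > f(x) + c1 * alpha * g @ d:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        if beta_rule == "FR":                # Fletcher-Reeves coefficient
            beta = (g_new @ g_new) / (g @ g)
        else:                                # Hestenes-Stiefel coefficient
            y = g_new - g
            beta = (g_new @ y) / (d @ y)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

For a strictly convex quadratic such as f(x) = ½‖x‖², a single step already reaches the minimizer, since the first direction is the exact steepest-descent direction.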
Many convergence results under various line search conditions have been studied. These methods can guarantee the descent property of each direction, provided the step length is computed by a line search satisfying the strong Wolfe conditions.

New formula for β k and the algorithm
In this section, a new conjugacy coefficient is introduced: it retains the Hestenes-Stiefel formula in the numerator, g_k^T y_{k−1}, and modifies the denominator, where y_{k−1} = g_k − g_{k−1} and μ is a positive constant.

Algorithm (2.1)
Step 1: Choose an initial point x_0.
Step 2: Set d_k = −g_k.
Step 3: Find α_k > 0 satisfying the strong Wolfe conditions.
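The steps of Algorithm (2.1) can be sketched as follows. Since the new coefficient β_k^{new} is not fully reproduced above, the Hestenes-Stiefel coefficient, which shares its numerator, is used here as a placeholder; the strong Wolfe search is delegated to SciPy's line_search, which implements the strong Wolfe conditions.

```python
import numpy as np
from scipy.optimize import line_search

def new_cg_sketch(f, grad, x0, tol=1e-6, max_iter=500):
    """Sketch of Algorithm (2.1) with an HS placeholder for beta_k^new."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                       # Step 2: initial search direction
    for k in range(max_iter):
        if np.linalg.norm(g) <= tol:             # stop at a (near-)stationary point
            break
        # Step 3: step length satisfying the strong Wolfe conditions
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:                        # search failed: restart with -g
            d = -g
            continue
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        beta = (g_new @ y) / (d @ y)             # HS placeholder for beta_k^new
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

On a small strictly convex quadratic this sketch converges to the unique minimizer A^{-1} b.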

Sufficient Descent Property and Global Convergence Analysis
We make the following basic assumptions on the objective function in order to establish the global convergence results for the new algorithm.

Assumption (3.1) (see [14]): (i) the level set S = {x : f(x) ≤ f(x_0)} is bounded; (ii) in some neighborhood of S, f is continuously differentiable and its gradient is Lipschitz continuous. Under these assumptions, there exists a constant ε > 0 such that ‖g(x)‖ ≤ ε for all x ∈ S.

Lemma (3.2): Suppose that Assumption (3.1) holds, let the sequence {x_k} be generated by Algorithm (2.1), and let the step length α_k satisfy the Wolfe conditions. Then

g_k^T d_k ≤ −c ‖g_k‖².   (8)

Proof: For the initial direction (k = 1), since d_0 = −g_0, we have g_0^T d_0 = −‖g_0‖², which satisfies (8). For k > 1, using (3) and (5) and following [7], we obtain (8) with c = 1 + 1/μ + 0.2/μ, where μ > 1.
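The sufficient descent condition (8) can be checked numerically along any run of a CG method. A minimal sketch, where the constant c and the test vectors are illustrative rather than the paper's exact quantities:

```python
import numpy as np

def sufficient_descent(g, d, c=0.1):
    """True if d is a sufficient descent direction for gradient g,
    i.e. g^T d <= -c * ||g||^2 as in condition (8)."""
    return g @ d <= -c * (g @ g)

g = np.array([2.0, -1.0])
# steepest descent satisfies (8) with c = 1: g^T(-g) = -||g||^2
assert sufficient_descent(g, -g, c=1.0)
# an ascent direction fails the condition
assert not sufficient_descent(g, g)
```

In practice such a check is useful as an assertion inside the CG loop: if it ever fails, the method has lost the descent property and a restart with d_k = −g_k is warranted.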
Theorem (3.3): Consider the iterative method x_{k+1} = x_k + α_k d_k, where d_k is defined by (3) and (5), and suppose that Assumption (3.1) holds. Then the new algorithm either stops at a stationary point or

lim inf_{k→∞} ‖g_k‖ = 0.   (3.7)

Numerical Results
In this section we report the performance of the new method on a set of 35 nonlinear unconstrained test problems, implemented in the Fortran language. These test problems come from the CUTE collection; details of the test functions can be found in [8] and [9]. For each test function, the number of variables is n = 100, 200, ..., 500. To evaluate the reliability of the new proposed method, we compare the new CG method against the standard HS method using the following measures: n, the dimension of the problem; iter, the number of iterations; irs, the number of restarts; fgcnt, the number of function and gradient evaluations; time, the total time required to complete the evaluation process; fxnew, the final function value; and gnorm, the minimum gradient value. Due to page limits we do not give the results for all test functions; see Table (1).
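Counters such as iter and fgcnt can be reproduced with a small wrapper around the objective and gradient. This Python sketch (not the paper's Fortran code) counts evaluations while running a few illustrative fixed-step gradient steps on a toy quadratic, not a CUTE problem:

```python
import numpy as np

class Counter:
    """Wraps a callable and counts invocations, mirroring the fgcnt statistic."""
    def __init__(self, fn):
        self.fn, self.calls = fn, 0
    def __call__(self, x):
        self.calls += 1
        return self.fn(x)

f = Counter(lambda x: 0.5 * x @ x)   # illustrative objective
g = Counter(lambda x: x)             # its gradient
x = np.array([3.0, 4.0])
for _ in range(5):                   # five fixed-step gradient steps
    x = x - 0.5 * g(x)
    fval = f(x)                      # evaluate the objective once per step
print(f.calls + g.calls)             # total function + gradient evaluations: 10
```

The same wrapper dropped around the functions passed to any solver gives the fgcnt column of Table (1) without modifying the solver itself.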

Table 1: Comparison of numerical results for the new CG method and the standard HS-CG method.

We have proposed a new β_k and provided a proof of its global convergence. On the selected list of test problems, the new proposed method β_k^{new} performs well compared with other standard conjugate gradient methods. A comparison of the totals over all 35 test functions for the new CG method against the HS method is also reported.