Some generalized three-term conjugate gradient methods based on CD approach for unconstrained optimization problems

In this paper, based on the efficient Conjugate Descent (CD) method, two generalized CD algorithms are proposed to solve unconstrained optimization problems. These are three-term conjugate gradient methods whose search directions, built from the conjugate gradient parameters, satisfy the sufficient descent condition independently of the line search. Furthermore, under the strong Wolfe line search, the global convergence of the proposed methods is proved. Preliminary numerical results on the CUTEst collection are also presented to show the effectiveness of our methods.


Introduction
Consider the following unconstrained optimization problem
$$\min_{x \in \mathbb{R}^n} f(x), \qquad (1)$$
where $f:\mathbb{R}^n \to \mathbb{R}$ is a continuously differentiable function whose gradient $g(x) := \nabla f(x)$ is available. Conjugate Gradient (CG) methods are effective iterative methods for solving (1), especially for large-scale problems. The important properties of these methods are that they use only first-order derivatives, require little storage and computation, and possess strong local and global convergence properties [1,9,18,22]. Starting from an initial guess $x_0 \in \mathbb{R}^n$, the CG methods generate a sequence $\{x_k\}_{k \ge 0}$ by
$$x_{k+1} = x_k + \alpha_k d_k, \qquad (2)$$
where $\alpha_k > 0$ is the step-length, usually obtained by some inexact line search. Furthermore, $d_k$ is the search direction, calculated by
$$d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_{k+1} d_k, \qquad (3)$$
in which $g_k := g(x_k)$ and $\beta_{k+1}$ is a scalar. There are many variants of CG methods, obtained from different choices of the parameter $\beta_{k+1}$. The most important CG methods, proposed by Fletcher-Reeves (FR) [16], Hestenes-Stiefel (HS) [19], Conjugate Descent (CD) by Fletcher [15], Polak-Ribiere-Polyak (PRP) [22,23], Dai-Yuan (DY) [10] and Hager-Zhang (HZ) [17], are defined by
$$\beta_{k+1}^{FR} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2}, \quad \beta_{k+1}^{HS} = \frac{g_{k+1}^T y_k}{d_k^T y_k}, \quad \beta_{k+1}^{CD} = -\frac{\|g_{k+1}\|^2}{g_k^T d_k}, \quad \beta_{k+1}^{PRP} = \frac{g_{k+1}^T y_k}{\|g_k\|^2}, \quad \beta_{k+1}^{DY} = \frac{\|g_{k+1}\|^2}{d_k^T y_k}, \quad \beta_{k+1}^{HZ} = \frac{1}{d_k^T y_k}\left(y_k - 2 d_k \frac{\|y_k\|^2}{d_k^T y_k}\right)^T g_{k+1}, \qquad (4)$$
where $y_k := g_{k+1} - g_k$ and $\|\cdot\|$ denotes the Euclidean norm. These methods are equivalent when $f$ is quadratic and an exact line search is used [21], but for a general objective function their behaviour is different.
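To make the scheme (2)-(4) concrete, the following minimal Python sketch evaluates the classical conjugate gradient parameters listed above and the corresponding two-term direction; the names beta, cg_direction, g_new, g_old and d_old are ours, introduced only for illustration.

```python
import numpy as np

def beta(rule, g_new, g_old, d_old):
    """Classical CG parameters beta_{k+1}; here y = g_{k+1} - g_k."""
    y = g_new - g_old
    if rule == "FR":   # Fletcher-Reeves
        return (g_new @ g_new) / (g_old @ g_old)
    if rule == "HS":   # Hestenes-Stiefel
        return (g_new @ y) / (d_old @ y)
    if rule == "CD":   # Conjugate Descent (Fletcher)
        return -(g_new @ g_new) / (g_old @ d_old)
    if rule == "PRP":  # Polak-Ribiere-Polyak
        return (g_new @ y) / (g_old @ g_old)
    if rule == "DY":   # Dai-Yuan
        return (g_new @ g_new) / (d_old @ y)
    if rule == "HZ":   # Hager-Zhang
        dy = d_old @ y
        return ((y - 2.0 * d_old * (y @ y) / dy) @ g_new) / dy
    raise ValueError(f"unknown rule: {rule}")

def cg_direction(rule, g_new, g_old, d_old):
    """Two-term direction d_{k+1} = -g_{k+1} + beta_{k+1} d_k as in (3)."""
    return -g_new + beta(rule, g_new, g_old, d_old) * d_old
```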
Generally, in iterative methods, the search direction $d_k$ is required to satisfy the descent condition $g_k^T d_k < 0$ for all $k \ge 0$.
In practice, the step-size $\alpha_k$ is determined by an inexact line search. Several inexact line search techniques are described in [21]. The standard Wolfe conditions are [24]
$$f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k g_k^T d_k,$$
$$g(x_k + \alpha_k d_k)^T d_k \ge c_2 g_k^T d_k,$$
where $0 < c_1 < c_2 < 1$. For the convergence analysis and numerical implementation of CG methods, the step-size $\alpha_k$ is often obtained from the strong Wolfe line search [25], which requires
$$f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k g_k^T d_k,$$
$$|g(x_k + \alpha_k d_k)^T d_k| \le -c_2 g_k^T d_k.$$
Furthermore, the generalized Wolfe conditions, with $0 < c_1 < c_3 < 1$ and $c_4 \ge 0$, are as follows:
$$f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k g_k^T d_k,$$
$$c_3\, g_k^T d_k \le g(x_k + \alpha_k d_k)^T d_k \le -c_4\, g_k^T d_k.$$
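For reference, the following sketch checks whether a trial step-length satisfies the standard, strong, or generalized Wolfe conditions stated above; the constants c1, c2, c3, c4 mirror $c_1, c_2, c_3, c_4$, and the helper name wolfe_status is ours.

```python
import numpy as np

def wolfe_status(f, grad, x, d, alpha, c1=1e-4, c2=0.9, c3=0.9, c4=0.9):
    """Check the Wolfe-type conditions for a trial step alpha along d."""
    g0d = grad(x) @ d              # g_k^T d_k (negative for a descent direction)
    gad = grad(x + alpha * d) @ d  # g(x_k + alpha_k d_k)^T d_k
    armijo = f(x + alpha * d) <= f(x) + c1 * alpha * g0d
    return {
        "standard":    armijo and gad >= c2 * g0d,
        "strong":      armijo and abs(gad) <= -c2 * g0d,
        "generalized": armijo and c3 * g0d <= gad <= -c4 * g0d,
    }
```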
General three-term conjugate gradient (TTCG) methods were first proposed by Beale [7] to solve unconstrained optimization problems. In this approach, the search direction is built from three terms rather than two, and various TTCG methods have since been designed to satisfy, for example, the descent property [2,6,26], the descent and conjugacy properties [4,11], and the sufficient descent and conjugacy properties [13,14]. A comparison between several TTCG methods for solving unconstrained optimization problems is reported in [5]. In this paper, we introduce two three-term conjugate gradient methods based on the CD algorithm. The generated search directions satisfy the sufficient descent property independently of the line search. The global convergence of the new methods is proven for general functions under mild assumptions. Moreover, numerical experiments confirm that our methods are efficient for solving unconstrained optimization problems compared with some conjugate gradient methods. The structure of this paper is as follows. In Section 2, we propose two generalizations of the CD algorithm which are TTCG methods. The sufficient descent property of the generated directions and the global convergence of the proposed algorithms are established in Section 3. In Section 4, we provide some numerical experiments to demonstrate the efficiency of our methods. Finally, some conclusions are given in Section 5.

Motivation and the new algorithms
In this section, we introduce two three-term conjugate gradient algorithms, based on the CD method, to solve the unconstrained optimization problem (1). Fletcher [15] proposed the CD conjugate gradient method, which is closely related to the FR method. Note that, to obtain the step-length $\alpha_k$, one would ideally solve the following one-dimensional optimization problem:
$$\alpha_k = \arg\min_{\alpha \ge 0} f(x_k + \alpha d_k).$$
On the other hand, the directions generated by the CD method satisfy the sufficient descent condition under the strong Wolfe line search [18]. Also, from the generalized Wolfe conditions with $c_3 = 1$ and $c_4 = 0$, we obtain $0 \le \beta_{k+1}^{CD} \le \beta_{k+1}^{FR}$. Hence, the global convergence of the CD method follows from Theorem 2.2 in [1]. We now generalize the CD method to obtain a new three-term conjugate gradient method (NTTCD), in which the direction $d_{k+1}$ is computed by adding a third term to the CD direction, where the parameter $\theta_{k+1}$ is chosen to guarantee the sufficient descent condition.
We also propose a modification of this method, the MNTTCD method, in which the search direction is generated by scaling the third term of the NTTCD direction with a parameter $t_k$ whose bounds are constant. Note that for $t_k = 0$ and $t_k = 1$ the MNTTCD method reduces to the CD and NTTCD methods, respectively. We now present the structure of the new three-term conjugate gradient algorithms as follows.
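The paper defines $\theta_{k+1}$ and the third term precisely; as an illustration of the three-term structure only (not the authors' exact choice), the sketch below uses a common device of this type, in which the third term is a multiple of $g_{k+1}$ chosen so that $g_{k+1}^T d_{k+1} = -\|g_{k+1}\|^2$ when $t_k = 1$, while $t_k = 0$ recovers the CD direction.

```python
import numpy as np

def ttcd_direction(g_new, g_old, d_old, t=1.0):
    """Illustrative CD-based three-term direction (not the paper's exact theta):
        d_{k+1} = -g_{k+1} + beta^CD_{k+1} d_k - t_k * theta_{k+1} * g_{k+1},
    with theta_{k+1} = beta^CD_{k+1} (g_{k+1}^T d_k) / ||g_{k+1}||^2, so that
    t = 0 gives the CD direction and t = 1 gives g_{k+1}^T d_{k+1} = -||g_{k+1}||^2."""
    beta_cd = -(g_new @ g_new) / (g_old @ d_old)
    theta = beta_cd * (g_new @ d_old) / (g_new @ g_new)
    return -g_new + beta_cd * d_old - t * theta * g_new
```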

Algorithm 1: The new three-term conjugate gradient method (NTTCD)
Step 0: Choose constants $\varepsilon > 0$ and $0 < c_1 < c_2 < 1$, and an initial point $x_0 \in \mathbb{R}^n$; set $d_0 = -g_0$ and $k := 0$.
Step 1: Terminate the algorithm once $\|g_k\| \le \varepsilon$ holds.
Step 2: Compute the step-length $\alpha_k$ satisfying the strong Wolfe conditions.
Step 3: Generate the new iterate by $x_{k+1} = x_k + \alpha_k d_k$.
Step 4: Calculate $g_{k+1}$ and the conjugate parameter $\beta_{k+1}^{CD}$ by (4).
Step 5: Obtain the parameter $\theta_{k+1}$ and compute the new direction $d_{k+1}$; set $k := k + 1$ and go to Step 1.
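The loop structure of Algorithm 1 can be sketched in Python as follows, using SciPy's strong Wolfe line search and the illustrative direction ttcd_direction given above; the tolerance and Wolfe constants are typical defaults, not necessarily those used in the experiments of Section 4.

```python
import numpy as np
from scipy.optimize import line_search

def nttcd(f, grad, x0, eps=1e-6, c1=1e-4, c2=0.1, max_iter=10000):
    """Sketch of the NTTCD loop with an illustrative three-term CD direction."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                        # d_0 = -g_0
    for k in range(max_iter):
        if np.linalg.norm(g) <= eps:              # Step 1: stopping test
            break
        alpha = line_search(f, grad, x, d, gfk=g, c1=c1, c2=c2)[0]  # Step 2
        if alpha is None:                         # line search failed
            break
        x = x + alpha * d                         # Step 3: new iterate
        g_new = grad(x)                           # Step 4: new gradient (and beta^CD)
        d = ttcd_direction(g_new, g, d)           # Step 5: new direction
        g = g_new
    return x, g, k
```

For example, x_star, g_star, k = nttcd(rosen, rosen_der, np.zeros(10)), with rosen and rosen_der imported from scipy.optimize, minimizes the 10-dimensional Rosenbrock function.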

Algorithm 2: The modification of new three-term conjugate gradient method (MNTTCD)
Step 0: Choose constants $\varepsilon > 0$ and $0 < c_1 < c_2 < 1$, the parameters defining $t_k$, and an initial point $x_0 \in \mathbb{R}^n$; set $d_0 = -g_0$ and $k := 0$.
Step 1: Terminate the algorithm once $\|g_k\| \le \varepsilon$ holds.
Step 2: Compute the step-length $\alpha_k$ satisfying the strong Wolfe conditions.
Step 3: Generate the new iterate by $x_{k+1} = x_k + \alpha_k d_k$.
Step 4: Calculate $g_{k+1}$, the conjugate parameter $\beta_{k+1}^{CD}$ by (4), and the parameter $t_k$.
Step 5: Obtain the parameter $\theta_{k+1}$ and compute the new direction $d_{k+1}$; set $k := k + 1$ and go to Step 1.

Convergence analysis
In this section, the sufficient descent property and the global convergence of the new algorithms are established. To this end, we make the following standard assumptions on the objective function:
(A1) The level set $\mathcal{L} = \{x \in \mathbb{R}^n : f(x) \le f(x_0)\}$ is bounded, where $x_0$ is the initial point.
(A2) In some neighborhood of $\mathcal{L}$, $f$ is continuously differentiable and its gradient $g$ is Lipschitz continuous.
Combining (4) and (17), we obtain $g_{k+1}^T d_{k+1} = -\|g_{k+1}\|^2$, which yields the sufficient descent condition. Therefore, the proof is complete.
Using (19), there are two choices for the parameter $t_k$.

Therefore, this contradiction completes the proof.
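As a quick empirical sanity check of the sufficient descent property established in this section, one can record the ratios $g_k^T d_k / \|g_k\|^2$ along the iterations and verify that they remain below some negative constant; the sketch below does this for the illustrative direction ttcd_direction of Section 2 (again, not the authors' exact formula).

```python
import numpy as np
from scipy.optimize import line_search, rosen, rosen_der

def descent_ratios(f, grad, x0, n_iter=50):
    """Ratios g_k^T d_k / ||g_k||^2; sufficient descent requires them <= -c < 0."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    ratios = []
    for _ in range(n_iter):
        ratios.append((g @ d) / (g @ g))
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:
            break
        x = x + alpha * d
        g_new = grad(x)
        d = ttcd_direction(g_new, g, d)   # illustrative direction from Section 2
        g = g_new
    return ratios

print(max(descent_ratios(rosen, rosen_der, np.zeros(8))))  # about -1 (up to rounding) for this choice
```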

Numerical experiments
In this section, we present numerical results on a set of nonlinear unconstrained optimization test functions from the CUTEst collection [8], listed in Table 1. The dimensions of the test functions range from 2 to 12005, and the initial points are the standard ones provided in CUTEst. We apply the proposed algorithms, together with some classical conjugate gradient methods, to solve these test functions. We use the performance profiles of Dolan and Moré [12] to compare the performance of the algorithms on the test functions. Figures 1-3 show the performance of all algorithms on the unconstrained optimization problems. In these figures, $P(\tau)$ designates the percentage of problems that are solved within a factor $\tau$ of the best solver; the value of $P(\tau)$ at $\tau = 1$ therefore measures the percentage of test functions on which a method is the fastest (efficiency), while its value for large $\tau$ measures the percentage of test functions that the method solves successfully (robustness). Figure 1 shows that the MNTTCD method wins on about 32% of the test problems with the smallest number of iterations. We conclude from Figure 2 that the NTTCD method is the most efficient in terms of the total number of function evaluations, winning on about 39% of the test functions. From Figure 3, we can see that the NTTCD method performs better than the other methods in terms of CPU time, with about 26% of the wins.
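The Dolan-Moré performance profiles of Figures 1-3 can be generated as in the following sketch, where perf is a problems-by-solvers array of performance measures (iterations, function evaluations, or CPU times) with np.inf marking failures; this is a generic implementation, not the authors' script.

```python
import numpy as np
import matplotlib.pyplot as plt

def performance_profile(perf, labels, tau_max=10.0):
    """Dolan-More performance profiles.
    perf[i, j] = cost of solver j on problem i (np.inf if solver j failed)."""
    best = perf.min(axis=1, keepdims=True)        # best cost per problem
    ratios = perf / best                          # performance ratios r_{i,j}
    taus = np.linspace(1.0, tau_max, 200)
    for j, label in enumerate(labels):
        # P(tau) = fraction of problems solved within a factor tau of the best solver
        p = [(ratios[:, j] <= t).mean() for t in taus]
        plt.step(taus, p, where="post", label=label)
    plt.xlabel(r"$\tau$")
    plt.ylabel(r"$P(\tau)$")
    plt.legend()
    plt.show()
```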

Conclusion
In this work, we propose two three-term conjugate gradient directions based on the CD conjugate gradient method. It is shown that the proposed directions always fulfil the sufficient descent property, independently of the line search. Under standard assumptions, we prove the convergence properties of the new schemes. Preliminary numerical experiments on a set of test functions from the CUTEst collection indicate that the new algorithms are effective.