A Spectral Conjugate Gradient Method with Descent Property

Mathematics ◽  
2020 ◽  
Vol 8 (2) ◽  
pp. 280
Author(s):  
Jinbao Jian ◽  
Lin Yang ◽  
Xianzhen Jiang ◽  
Pengjie Liu ◽  
Meixing Liu

The spectral conjugate gradient method (SCGM) is an important generalization of the conjugate gradient method (CGM) and one of the effective numerical methods for large-scale unconstrained optimization. Designing the spectral parameter and the conjugate parameter is a core task in SCGM, and the aim of this paper is to propose a new and effective way of choosing these two parameters. First, motivated by the strong Wolfe line search requirement, we design a new spectral parameter. Second, we propose a hybrid conjugate parameter. Choosing the two parameters in this way ensures that the search directions always possess the descent property, independent of any line search rule. As a result, a new SCGM with the standard Wolfe line search is proposed. Under the usual assumptions, the global convergence of the proposed SCGM is proved. Finally, on 108 test instances of 2 to 1,000,000 dimensions from the CUTE library and other classic test collections, extensive numerical experiments comparing the presented SCGM with both SCGMs and CGMs are executed. The detailed results and the corresponding performance profiles are reported, showing that the proposed SCGM is effective and promising.
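The abstract above describes the general SCGM iteration without giving formulas. As a minimal sketch only, the skeleton below shows how the spectral parameter theta_k and conjugate parameter beta_k enter the search direction d_k = -theta_k g_k + beta_k d_{k-1}; the paper's specific parameter formulas are not reproduced, and a simple Armijo backtracking stands in for the (strong) Wolfe line search:

```python
import numpy as np

def spectral_cg(f, grad, x0, theta_fn, beta_fn, tol=1e-6, max_iter=500):
    """Generic spectral CG skeleton: d_k = -theta_k * g_k + beta_k * d_{k-1}.

    theta_fn and beta_fn supply the spectral and conjugate parameters;
    the formulas proposed in the paper are NOT reproduced here."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking stands in for the (strong) Wolfe search.
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        theta = theta_fn(g, g_new, d, alpha)
        beta = beta_fn(g, g_new, d)
        d = -theta * g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: theta_k = 1 recovers plain CG; here with the Fletcher-Reeves beta.
fr_beta = lambda g, g_new, d: g_new.dot(g_new) / g.dot(g)
quad = lambda x: 0.5 * x.dot(x)          # f(x) = ||x||^2 / 2
quad_grad = lambda x: x
x_star = spectral_cg(quad, quad_grad, np.array([3.0, -4.0]),
                     theta_fn=lambda g, gn, d, a: 1.0, beta_fn=fr_beta)
```

Setting theta_fn to a constant 1 recovers a classical two-term CGM, which is why SCGM is viewed as a generalization of CGM.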

2012 ◽  
Vol 2012 ◽  
pp. 1-10 ◽  
Author(s):  
Liu Jinkui ◽  
Du Xianglin ◽  
Wang Kairong

A mixed spectral CD-DY conjugate descent method for solving unconstrained optimization problems is proposed, combining the advantages of the spectral conjugate gradient method, the CD method, and the DY method. Under the Wolfe line search, the proposed method generates a descent direction in each iteration, and its global convergence is also guaranteed. Numerical results show that the new method is efficient and stable compared to the CD (Fletcher, 1987), DY (Dai and Yuan, 1999), and SFR (Du and Chen, 2008) methods, so it can be widely used in scientific computation.


2018 ◽  
Vol 2018 ◽  
pp. 1-13 ◽  
Author(s):  
Bakhtawar Baluch ◽  
Zabidin Salleh ◽  
Ahmad Alhawarat

This paper describes a modified three-term Hestenes–Stiefel (HS) method. The original HS method is the earliest conjugate gradient method. Although the HS method achieves global convergence using an exact line search, this is not guaranteed in the case of an inexact line search. In addition, the HS method does not usually satisfy the descent property. Our modified three-term conjugate gradient method possesses a sufficient descent property regardless of the type of line search and guarantees global convergence using the inexact Wolfe–Powell line search. The numerical efficiency of the modified three-term HS method is checked using 75 standard test functions. It is known that three-term conjugate gradient methods are numerically more efficient than two-term conjugate gradient methods. Importantly, this paper quantifies how much better the three-term performance is compared with two-term methods. Thus, in the numerical results, we compare our new modification with an efficient two-term conjugate gradient method. We also compare our modification with a state-of-the-art three-term HS method. Finally, we conclude that our proposed modification is globally convergent and numerically efficient.
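The abstract states that the modified three-term direction satisfies sufficient descent for any line search but does not show the mechanism. As an illustrative sketch, the function below implements one standard three-term HS construction from the literature (not necessarily the modification proposed above): the two correction terms cancel in g_k^T d_k, forcing g_k^T d_k = -||g_k||^2 regardless of the line search:

```python
import numpy as np

def three_term_hs_direction(g_new, g_old, d_old):
    """Illustrative three-term HS direction (NOT the paper's exact rule):
        d_k = -g_k + beta * d_{k-1} - theta * y_{k-1},  y = g_k - g_{k-1}.
    The beta and theta terms cancel in g_k^T d_k, so the sufficient
    descent identity g_k^T d_k = -||g_k||^2 holds for ANY line search."""
    y = g_new - g_old
    denom = d_old.dot(y)
    if abs(denom) < 1e-12:            # safeguard: fall back to steepest descent
        return -g_new
    beta = g_new.dot(y) / denom       # Hestenes-Stiefel conjugate parameter
    theta = g_new.dot(d_old) / denom  # third-term weight
    return -g_new + beta * d_old - theta * y

# The descent identity can be checked numerically on random data:
rng = np.random.default_rng(0)
g_new, g_old, d_old = rng.normal(size=(3, 5))
d = three_term_hs_direction(g_new, g_old, d_old)
```

This is exactly the property the classical two-term HS method lacks under an inexact line search, which motivates the three-term correction.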


2014 ◽  
Vol 989-994 ◽  
pp. 1802-1805
Author(s):  
Hong Fang Cui

On the basis of the CD conjugate gradient method, this article constructs a new two-parameter P-NCD projected conjugate gradient method, establishes the descent property of the projected direction, and proves convergence under the strong Wolfe line search criteria. The new algorithm is applied to an instantiated test of a linearly constrained equation-estimation model, and the results show that it performs well.


Author(s):  
Pro Kaelo ◽  
Sindhu Narayanan ◽  
M.V. Thuto

This article presents a modified quadratic hybridization of the Polak–Ribiere–Polyak and Fletcher–Reeves conjugate gradient method for solving unconstrained optimization problems. Global convergence, with the strong Wolfe line search conditions, of the proposed quadratic hybrid conjugate gradient method is established. We also report some numerical results to show the competitiveness of the new hybrid method.


2012 ◽  
Vol 2012 ◽  
pp. 1-12 ◽  
Author(s):  
Jinkui Liu ◽  
Youyi Jiang

A new nonlinear spectral conjugate descent method for solving unconstrained optimization problems is proposed on the basis of the CD method and the spectral conjugate gradient method. For any line search, the new method satisfies the sufficient descent condition g_k^T d_k < −‖g_k‖^2. Moreover, we prove that the new method is globally convergent under the strong Wolfe line search. The numerical results show that the new method is more effective on the given test problems from the CUTE test problem library (Bongartz et al., 1995) than the well-known CD, FR, and PRP methods.


Author(s):  
Chenna Nasreddine ◽  
Sellami Badreddine ◽  
Belloufi Mohammed

In this paper, we present a new hybrid conjugate gradient method for solving nonlinear unconstrained optimization problems, obtained as a convex combination of the Liu–Storey (LS) and Hager–Zhang (HZ) conjugate gradient methods. The method possesses the sufficient descent property and is globally convergent under the strong Wolfe line search. At the end of the paper, we illustrate the method with some numerical examples.
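A convex combination of two conjugate parameters can be sketched as below, using the standard published LS and HZ formulas; the mixing weight lam is treated here as a free parameter in [0, 1], since the authors' rule for choosing it is not given in the abstract:

```python
import numpy as np

def hybrid_beta(g_new, g_old, d_old, lam):
    """Convex combination beta = (1 - lam) * beta_LS + lam * beta_HZ.

    Uses the standard Liu-Storey and Hager-Zhang formulas; the paper's
    rule for selecting lam is not reproduced here."""
    y = g_new - g_old
    beta_ls = g_new.dot(y) / (-g_old.dot(d_old))                 # Liu-Storey
    dy = d_old.dot(y)
    beta_hz = (y - 2.0 * d_old * y.dot(y) / dy).dot(g_new) / dy  # Hager-Zhang
    return (1.0 - lam) * beta_ls + lam * beta_hz

# lam = 0 recovers pure LS, lam = 1 recovers pure HZ (hypothetical data):
g_new = np.array([1.0, 2.0])
g_old = np.array([0.5, -1.0])
d_old = np.array([-0.5, 1.0])
b_ls = hybrid_beta(g_new, g_old, d_old, 0.0)
b_hz = hybrid_beta(g_new, g_old, d_old, 1.0)
```

Because the combination is convex, the hybrid parameter always lies between the LS and HZ values, which is what lets convergence arguments for the endpoints carry over.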

