Global convergence property of s-dependent GFR conjugate gradient method

1998 ◽  
Vol 43 (23) ◽  
pp. 1959-1965 ◽  
Author(s):  
Changyu Wang ◽  
Yuzhong Zhang

2022 ◽  
Vol 2022 (1) ◽  
Author(s):  
Zabidin Salleh ◽  
Adel Almarashi ◽  
Ahmad Alhawarat

The conjugate gradient method can be applied in many fields, such as neural networks, image restoration, machine learning, and deep learning. The Polak–Ribière–Polyak (PRP) and Hestenes–Stiefel (HS) conjugate gradient methods are considered among the most efficient methods for solving nonlinear optimization problems. However, neither method satisfies the descent property or the global convergence property for general nonlinear functions. In this paper, we present two new modifications of the PRP method with restart conditions. The proposed conjugate gradient methods satisfy both the global convergence property and the descent property for general nonlinear functions. The numerical results show that the new modifications are more efficient than recent CG methods in terms of number of iterations, number of function evaluations, number of gradient evaluations, and CPU time.
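The abstract does not reproduce the paper's exact restart conditions. As a minimal illustrative sketch (not the authors' modification), the classical PRP+ truncation restarts along the steepest-descent direction whenever the PRP parameter turns negative; on a quadratic with an exact line search this looks like:

```python
import numpy as np

def prp_plus_cg(A, b, x0, tol=1e-10, max_iter=500):
    """Minimize 0.5 x^T A x - b^T x with a PRP conjugate gradient method.

    Illustrative PRP+ restart: beta is truncated at zero, so the method
    falls back to steepest descent whenever beta^PRP < 0.
    """
    x = x0.astype(float)
    g = A @ x - b            # gradient of the quadratic
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        Ad = A @ d
        alpha = -(g @ d) / (d @ Ad)      # exact line search on the quadratic
        x = x + alpha * d
        g_new = A @ x - b
        y = g_new - g                    # gradient difference y_k
        beta = max((g_new @ y) / (g @ g), 0.0)  # PRP beta, restart if < 0
        d = -g_new + beta * d
        g = g_new
    return x
```

For general nonlinear functions the exact line search above would be replaced by an inexact (e.g., Wolfe–Powell) line search, which is where the paper's stronger restart conditions come into play.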


2020 ◽  
Vol 2020 ◽  
pp. 1-13 ◽  
Author(s):  
Junyu Lu ◽  
Yong Li ◽  
Hongtruong Pham

An adaptive choice for the parameter of the Dai–Liao conjugate gradient method is suggested in this paper, derived from a modified quasi-Newton equation; the result is a modified Dai–Liao conjugate gradient method. The proposed method has several interesting features: (i) the value of the parameter t in the modified Dai–Liao method uses both gradient and function value information; (ii) the global convergence property of the modified Dai–Liao method is established under suitable assumptions; (iii) numerical results show that the modified DL method is effective in practical computation and on image restoration problems.
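The paper's adaptive formula for t is not given in this listing, but the underlying Dai–Liao direction itself is standard: d_{k+1} = -g_{k+1} + beta_k d_k with beta_k = (g_{k+1}^T y_k - t g_{k+1}^T s_k) / (d_k^T y_k), where y_k = g_{k+1} - g_k and s_k = x_{k+1} - x_k. A minimal sketch, treating t as a free nonnegative parameter:

```python
import numpy as np

def dai_liao_direction(g_new, g_old, d_old, s_old, t):
    """Dai-Liao search direction.

    d_{k+1} = -g_{k+1} + beta_k d_k, with
    beta_k = (g_{k+1}^T y_k - t * g_{k+1}^T s_k) / (d_k^T y_k),
    y_k = g_{k+1} - g_k.  The parameter t >= 0 is left to the caller;
    the paper's adaptive choice (from a modified quasi-Newton equation)
    is not reproduced here.
    """
    y = g_new - g_old
    beta = (g_new @ y - t * (g_new @ s_old)) / (d_old @ y)
    return -g_new + beta * d_old
```

Note that t = 0 recovers the Hestenes–Stiefel direction, which is one way to sanity-check an implementation.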


2014 ◽  
Vol 9 (5) ◽  
pp. 999-1015 ◽  
Author(s):  
Ioannis E. Livieris ◽  
Panagiotis Pintelas

2018 ◽  
Vol 2018 ◽  
pp. 1-13 ◽  
Author(s):  
Bakhtawar Baluch ◽  
Zabidin Salleh ◽  
Ahmad Alhawarat

This paper describes a modified three-term Hestenes–Stiefel (HS) method. The original HS method is the earliest conjugate gradient method. Although the HS method achieves global convergence with an exact line search, this is not guaranteed with an inexact line search. In addition, the HS method does not usually satisfy the descent property. Our modified three-term conjugate gradient method possesses a sufficient descent property regardless of the type of line search and guarantees global convergence under the inexact Wolfe–Powell line search. The numerical efficiency of the modified three-term HS method is checked on 75 standard test functions. It is known that three-term conjugate gradient methods are numerically more efficient than two-term conjugate gradient methods; importantly, this paper quantifies how large that advantage is. Thus, in the numerical results, we compare our new modification with an efficient two-term conjugate gradient method. We also compare our modification with a state-of-the-art three-term HS method. Finally, we conclude that our proposed modification is globally convergent and numerically efficient.
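The paper's specific third term is not given in this listing. As an illustrative sketch only, one well-known three-term HS construction (in the style of Zhang–Zhou–Li, not necessarily the authors' modification) adds a correction along y_k that makes g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 hold exactly, independent of the line search:

```python
import numpy as np

def three_term_hs_direction(g_new, g_old, d_old):
    """A well-known three-term HS direction (illustrative, not the paper's):

        d_{k+1} = -g_{k+1} + beta_k d_k - theta_k y_k,
        beta_k  = g_{k+1}^T y_k / (d_k^T y_k),
        theta_k = g_{k+1}^T d_k / (d_k^T y_k),   y_k = g_{k+1} - g_k.

    The beta and theta terms cancel in g_{k+1}^T d_{k+1}, leaving exactly
    -||g_{k+1}||^2, i.e. sufficient descent for any line search.
    """
    y = g_new - g_old
    denom = d_old @ y
    beta = (g_new @ y) / denom
    theta = (g_new @ d_old) / denom
    return -g_new + beta * d_old - theta * y
```

This identity is why three-term variants can guarantee descent where the plain two-term HS method cannot.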

