A modified scaled conjugate gradient method with global convergence for nonconvex functions

2014, Vol. 21 (3), pp. 465–477. Author(s): Saman Babaie-Kafaki, Reza Ghanbari
2017, Vol. 52 (2), pp. 361–375. Author(s): Mohammad Reza Arazm, Saman Babaie-Kafaki, Reza Ghanbari

2014, Vol. 9 (5), pp. 999–1015. Author(s): Ioannis E. Livieris, Panagiotis Pintelas

2018, Vol. 2018, pp. 1–13. Author(s): Bakhtawar Baluch, Zabidin Salleh, Ahmad Alhawarat

This paper describes a modified three-term Hestenes–Stiefel (HS) method. The original HS method is the earliest conjugate gradient method; although it achieves global convergence with an exact line search, this is not guaranteed under an inexact line search, and the method does not, in general, satisfy the descent property. The modified three-term conjugate gradient method proposed here possesses the sufficient descent property regardless of the line search used and is globally convergent under the inexact Wolfe–Powell line search. The numerical efficiency of the modified three-term HS method is assessed on 75 standard test functions. Three-term conjugate gradient methods are generally more efficient numerically than two-term methods, and this paper quantifies that advantage: in the numerical results, the new modification is compared with an efficient two-term conjugate gradient method and with a state-of-the-art three-term HS method. We conclude that the proposed modification is globally convergent and numerically efficient.
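To make the three-term HS idea concrete, the sketch below implements one well-known three-term HS-type direction update, d_{k+1} = -g_{k+1} + β_k d_k − θ_k y_k with the Hestenes–Stiefel coefficient β_k = g_{k+1}ᵀy_k / (d_kᵀy_k) and θ_k = g_{k+1}ᵀd_k / (d_kᵀy_k), which satisfies d_{k+1}ᵀg_{k+1} = −‖g_{k+1}‖² for any line search. This is an illustrative generic variant, not the paper's exact formula, and simple Armijo backtracking is used as a stand-in for the Wolfe–Powell line search; the function name and safeguards are the author's own for this sketch.

```python
import numpy as np

def three_term_hs_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic three-term HS-type conjugate gradient method (illustrative sketch).

    Direction update (one common three-term HS variant; the paper's exact
    formula may differ):
        d_{k+1} = -g_{k+1} + beta_k * d_k - theta_k * y_k,
    where y_k = g_{k+1} - g_k, beta_k = g_{k+1}^T y_k / (d_k^T y_k) is the
    Hestenes-Stiefel coefficient, and theta_k = g_{k+1}^T d_k / (d_k^T y_k).
    This choice gives d_{k+1}^T g_{k+1} = -||g_{k+1}||^2, i.e. sufficient
    descent independent of the line search.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking: a simple stand-in for the Wolfe-Powell search.
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        while f(x + alpha * d) > fx + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d.dot(y)
        if abs(denom) < 1e-12:
            d_new = -g_new              # safeguard: restart with steepest descent
        else:
            beta = g_new.dot(y) / denom     # Hestenes-Stiefel coefficient
            theta = g_new.dot(d) / denom    # third-term coefficient
            d_new = -g_new + beta * d - theta * y
        x, g, d = x_new, g_new, d_new
    return x

# Usage: minimize a convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
quad = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
quad_grad = lambda x: A.dot(x) - b
x_star = three_term_hs_cg(quad, quad_grad, np.zeros(2))  # x_star ≈ [0.2, 0.4]
```

Because the third term cancels the β_k d_k contribution to d_{k+1}ᵀg_{k+1} exactly, the search direction is a descent direction at every iterate, which is what allows global convergence arguments to go through under inexact (Wolfe-type) line searches.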

