A Globally Convergent Hybrid Conjugate Gradient Method and Its Numerical Behaviors

2013, Vol 2013, pp. 1-14
Author(s):  
Yuan-Yuan Huang ◽  
San-Yang Liu ◽  
Xue-Wu Du ◽  
Xiao-Liang Dong

We consider a hybrid Dai-Yuan conjugate gradient method. We confirm that its numerical performance improves when the method uses a practical steplength rule developed by Dong, and we analyze the associated convergence as well.
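The abstract does not specify Dong's steplength rule or the exact hybrid variant studied, but the hybrid Dai-Yuan family is commonly built by clipping the Hestenes-Stiefel coefficient into the interval [0, β_DY]. The sketch below shows that standard construction as an illustration, not necessarily the paper's exact method:

```python
import numpy as np

def beta_dy(g_new, g_old, d_old):
    # Dai-Yuan coefficient: beta = ||g_{k+1}||^2 / (d_k^T (g_{k+1} - g_k))
    y = g_new - g_old
    return float(g_new @ g_new) / float(d_old @ y)

def beta_hs(g_new, g_old, d_old):
    # Hestenes-Stiefel coefficient, the usual partner in hybrid DY schemes
    y = g_new - g_old
    return float(g_new @ y) / float(d_old @ y)

def beta_hybrid_dy(g_new, g_old, d_old):
    # A common hybrid Dai-Yuan rule (an assumption here, not confirmed to be
    # the paper's variant): clip the HS coefficient into [0, beta_DY]
    return max(0.0, min(beta_hs(g_new, g_old, d_old),
                        beta_dy(g_new, g_old, d_old)))
```

The clipping keeps the coefficient nonnegative, which is what the Dai-Yuan analysis exploits to guarantee descent under weak line search conditions.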

Author(s):  
Nur Syarafina Mohamed ◽  
Mustafa Mamat ◽  
Mohd Rivaie ◽  
Shazlyn Milleana Shaharudin

One of the popular approaches to modifying the Conjugate Gradient (CG) method is hybridization. In this paper, a new hybrid CG method is introduced and its performance is compared to two classical CG methods, Rivaie-Mustafa-Ismail-Leong (RMIL) and Syarafina-Mustafa-Rivaie (SMR). The proposed hybrid CG is constructed as a convex combination of the RMIL and SMR methods, and its performance is analyzed under the exact line search. The comparison shows that the hybrid CG is promising and outperforms the classical RMIL and SMR methods in terms of the number of iterations and CPU time.
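The convex-combination idea can be sketched on a quadratic test problem, where the exact line search has a closed form. The RMIL coefficient is β = g_{k+1}ᵀ(g_{k+1} − g_k)/‖d_k‖²; the SMR formula is not given in this abstract, so Fletcher-Reeves stands in as the second coefficient purely for illustration, and the mixing weight `theta` is a hypothetical parameter:

```python
import numpy as np

def beta_rmil(g_new, g_old, d_old):
    # RMIL coefficient: beta = g_{k+1}^T (g_{k+1} - g_k) / ||d_k||^2
    return float(g_new @ (g_new - g_old)) / float(d_old @ d_old)

def beta_fr(g_new, g_old, d_old):
    # Fletcher-Reeves coefficient, used here only as a stand-in for SMR,
    # whose formula the abstract does not state
    return float(g_new @ g_new) / float(g_old @ g_old)

def hybrid_cg_quadratic(A, b, x0, theta=0.5, tol=1e-10, max_iter=500):
    # Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    # with an exact line search; beta is the convex combination
    # (1 - theta) * beta_RMIL + theta * beta_FR.
    x = np.asarray(x0, dtype=float)
    g = A @ x - b                         # gradient of the quadratic
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ (A @ d))  # exact minimizer along d
        x = x + alpha * d
        g_new = A @ x - b
        beta = ((1 - theta) * beta_rmil(g_new, g, d)
                + theta * beta_fr(g_new, g, d))
        d = -g_new + beta * d
        g = g_new
    return x
```

With an exact line search the new direction always satisfies g_{k+1}ᵀd_{k+1} = −‖g_{k+1}‖², so every hybrid direction is a descent direction regardless of `theta`.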


Author(s):  
P. Kaelo ◽  
Sindhu Narayanan ◽  
M.V. Thuto

This article presents a modified quadratic hybridization of the Polak-Ribière-Polyak (PRP) and Fletcher-Reeves (FR) conjugate gradient methods for solving unconstrained optimization problems. Global convergence of the proposed quadratic hybrid conjugate gradient method is established under the strong Wolfe line search conditions. We also report numerical results showing the competitiveness of the new hybrid method.
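The strong Wolfe conditions the convergence proof relies on are standard and easy to state concretely. A minimal checker, assuming typical constants c1 = 1e-4 and c2 = 0.1 (the paper's own constants are not given in the abstract):

```python
import numpy as np

def satisfies_strong_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
    # Strong Wolfe conditions for a step alpha along direction d:
    #   sufficient decrease: f(x + a*d) <= f(x) + c1 * a * g(x)^T d
    #   strong curvature:    |g(x + a*d)^T d| <= c2 * |g(x)^T d|
    slope0 = grad(x) @ d
    x_new = x + alpha * d
    armijo = f(x_new) <= f(x) + c1 * alpha * slope0
    curvature = abs(grad(x_new) @ d) <= c2 * abs(slope0)
    return bool(armijo and curvature)
```

The absolute value in the curvature condition is what distinguishes the strong Wolfe conditions from the weak ones: it forces the accepted step close to a one-dimensional minimizer along d, which hybrid PRP/FR analyses typically need.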

