A Conjugate Gradient Method for Unconstrained Optimization Problems

2009 ◽  
Vol 2009 ◽  
pp. 1-14 ◽  
Author(s):  
Gonglin Yuan

A hybrid method combining the FR conjugate gradient method and the WYL conjugate gradient method is proposed for unconstrained optimization problems. The presented method possesses the sufficient descent property under the strong Wolfe-Powell (SWP) line search rule, relaxing the parameter σ &lt; 1. Under suitable conditions, global convergence with the SWP line search rule and the weak Wolfe-Powell (WWP) line search rule is established for nonconvex functions. Numerical results show that this method performs better than the FR method and the WYL method.
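The FR and WYL update parameters referenced above have standard textbook definitions. A minimal sketch of those two formulas and the usual conjugate gradient direction update follows; this is an illustration of the standard ingredients, not the authors' specific hybridization rule, which is given in the paper.

```python
import numpy as np

def beta_fr(g_new, g_old):
    # Fletcher-Reeves parameter: ||g_k||^2 / ||g_{k-1}||^2
    return (g_new @ g_new) / (g_old @ g_old)

def beta_wyl(g_new, g_old):
    # Wei-Yao-Liu parameter:
    # g_k^T (g_k - (||g_k|| / ||g_{k-1}||) g_{k-1}) / ||g_{k-1}||^2
    ratio = np.linalg.norm(g_new) / np.linalg.norm(g_old)
    return g_new @ (g_new - ratio * g_old) / (g_old @ g_old)

def cg_direction(g_new, g_old, d_old, beta_fn):
    # Standard conjugate gradient direction: d_k = -g_k + beta_k * d_{k-1}
    return -g_new + beta_fn(g_new, g_old) * d_old
```

Note that when successive gradients are orthogonal the two parameters coincide; they differ on general nonquadratic problems, which is what a hybrid scheme exploits.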

Author(s):  
Pro Kaelo ◽  
Sindhu Narayanan ◽  
M.V. Thuto

This article presents a modified quadratic hybridization of the Polak–Ribière–Polyak and Fletcher–Reeves conjugate gradient methods for solving unconstrained optimization problems. Global convergence, under the strong Wolfe line search conditions, of the proposed quadratic hybrid conjugate gradient method is established. We also report some numerical results to show the competitiveness of the new hybrid method.


2018 ◽  
Vol 7 (3.28) ◽  
pp. 92
Author(s):  
Talat Alkouli ◽  
Mustafa Mamat ◽  
Mohd Rivaie ◽  
Puspa Liza Ghazali

In this paper, an efficient modification of the nonlinear conjugate gradient method and an associated implementation, based on an exact line search, are proposed and analyzed to solve large-scale unconstrained optimization problems. The method satisfies the sufficient descent property. Furthermore, a global convergence result is proved. Computational results for a set of unconstrained optimization test problems, some of them from the CUTE library, show that this new conjugate gradient algorithm converges more stably and outperforms similar methods in many situations.


2019 ◽  
Vol 38 (7) ◽  
pp. 227-231
Author(s):  
Huda Younus Najm ◽  
Eman T. Hamed ◽  
Huda I. Ahmed

In this study, we propose a new parameter in the conjugate gradient method. It is shown that the new method fulfils the sufficient descent condition under the strong Wolfe condition when an inexact line search is used. Numerical results also show that the suggested method outperforms other standard conjugate gradient methods.


2019 ◽  
Vol 2019 ◽  
pp. 1-9
Author(s):  
Jiankun Liu ◽  
Shouqiang Du

We propose a modified three-term conjugate gradient method with the Armijo line search for solving unconstrained optimization problems. The proposed method possesses the sufficient descent property. Under mild assumptions, the global convergence property of the proposed method with the Armijo line search is proved. Due to its simplicity, low storage requirements, and nice convergence properties, the proposed method is used to solve M-tensor systems and a class of nonsmooth optimization problems involving the l1-norm. Finally, the given numerical experiments show the efficiency of the proposed method.
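The Armijo line search mentioned above is a standard backtracking rule: starting from an initial step, the step length is repeatedly shrunk until a sufficient decrease condition holds. A minimal sketch of the rule follows (the paper's specific three-term direction is not reproduced here; `s`, `rho`, and `delta` are conventional parameter names, not the authors' notation).

```python
def armijo_step(f, x, g, d, s=1.0, rho=0.5, delta=1e-4):
    # Backtracking Armijo rule: find t = s * rho^m such that
    #   f(x + t d) <= f(x) + delta * t * g^T d,
    # assuming d is a descent direction (g^T d < 0) so the loop terminates.
    t = s
    fx = f(x)
    gd = g @ d
    while f(x + t * d) > fx + delta * t * gd:
        t *= rho
    return t
```

For example, minimizing f(x) = ||x||^2 from x = (2, 0) along the steepest descent direction d = -∇f(x) = (-4, 0), the rule rejects the full step t = 1 (which overshoots past the minimizer) and accepts t = 0.5.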


Algorithms ◽  
2018 ◽  
Vol 11 (9) ◽  
pp. 133 ◽  
Author(s):  
Xiuyun Zheng ◽  
Jiarong Shi

In this paper, a modification to the Polak–Ribière–Polyak (PRP) nonlinear conjugate gradient method is presented. The proposed method always generates a sufficient descent direction independent of the accuracy of the line search and the convexity of the objective function. Under appropriate conditions, the modified method is proved to possess global convergence under the Wolfe or Armijo-type line search. Moreover, the proposed methodology is adopted in the Hestenes–Stiefel (HS) and Liu–Storey (LS) methods. Extensive preliminary numerical experiments are used to illustrate the efficiency of the proposed method.
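For context, the classical PRP parameter and the sufficient descent condition that such modifications aim to guarantee can be sketched as follows. This shows the standard PRP formula and the descent test only, not the paper's modified direction; the constant `c` is a conventional placeholder.

```python
import numpy as np

def beta_prp(g_new, g_old):
    # Classical Polak-Ribiere-Polyak parameter:
    # g_k^T (g_k - g_{k-1}) / ||g_{k-1}||^2
    return g_new @ (g_new - g_old) / (g_old @ g_old)

def satisfies_sufficient_descent(g, d, c=1e-2):
    # Sufficient descent condition: d_k^T g_k <= -c ||g_k||^2.
    # The classical PRP direction can violate this with an inexact
    # line search; modified methods enforce it by construction.
    return g @ d <= -c * (g @ g)
```

The steepest descent direction d = -g always satisfies the condition for any c ≤ 1, which is why modified directions are typically designed to stay "close enough" to -g.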


Author(s):  
Ibrahim Abdullahi ◽  
Rohanin Ahmad

In this paper, we propose a new hybrid conjugate gradient method for unconstrained optimization problems. The proposed method combines the beta (DY), beta (WHY), beta (RAMI), and beta (New) parameters; beta (New) was constructed specifically for this hybrid method. The method possesses the sufficient descent property irrespective of the line search. Under the strong Wolfe–Powell line search, we prove that the method is globally convergent. Numerical experiments show the effectiveness and robustness of the proposed method when compared with some hybrid as well as some modified conjugate gradient methods.

