Nonlinear Conjugate Gradient Methods with Sufficient Descent Condition for Large-Scale Unconstrained Optimization

2009, Vol. 2009, pp. 1-16
Author(s): Jianguo Zhang, Yunhai Xiao, Zengxin Wei

Two nonlinear conjugate gradient-type methods for solving unconstrained optimization problems are proposed. An attractive property of the methods is that the generated directions are always descent directions, independent of any line search. Under some mild conditions, global convergence results for both methods are established. Preliminary numerical results show that the proposed methods are promising and competitive with the well-known PRP method.
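
To make the shape of such an iteration concrete, here is a minimal Python sketch of a PRP-type nonlinear CG loop with a standard restart safeguard that enforces descent. It illustrates the general scheme under simple assumptions (Armijo backtracking as a stand-in step-size rule, a PRP+ nonnegativity clip), not the paper's specific updates.

```python
import numpy as np

def armijo_backtracking(f, grad, x, d, alpha=1.0, rho=0.5, c=1e-4):
    """Simple Armijo backtracking, standing in for any step-size rule."""
    fx, gTd = f(x), grad(x) @ d
    while f(x + alpha * d) > fx + c * alpha * gTd:
        alpha *= rho
    return alpha

def prp_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """PRP-type nonlinear CG with a restart safeguard enforcing descent."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = armijo_backtracking(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))  # PRP+ parameter
        d = -g_new + beta * d
        if g_new @ d >= 0:   # safeguard: restart along steepest descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Example: minimize ||x||^2 from a nonzero start.
print(prp_cg(lambda x: x @ x, lambda x: 2 * x, np.ones(5)))
```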

2013, Vol. 2013, pp. 1-5
Author(s): Yuan-Yuan Chen, Shou-Qiang Du

The nonlinear conjugate gradient method is one of the most useful methods for unconstrained optimization problems. In this paper, we consider three kinds of nonlinear conjugate gradient methods with a Wolfe-type line search for unconstrained optimization problems. Under some mild assumptions, global convergence results for the given methods are established. The numerical results show that the nonlinear conjugate gradient methods with a Wolfe-type line search are efficient for some unconstrained optimization problems.
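
For reference, a textbook bisection-style search for a step satisfying the weak Wolfe conditions looks roughly as follows; the constants c1, c2 and the bracketing strategy are generic choices, not the specific Wolfe-type rule studied in the paper.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, alpha=1.0, max_iter=50):
    """Find a step a satisfying the weak Wolfe conditions:
        f(x + a*d) <= f(x) + c1*a*g'd     (sufficient decrease)
        grad(x + a*d)'d >= c2*g'd         (curvature)
    A generic textbook sketch; c1, c2 and the bracketing are placeholders."""
    fx, gTd = f(x), grad(x) @ d
    lo, hi = 0.0, np.inf
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * gTd:
            hi = alpha                                   # decrease fails: shrink
        elif grad(x + alpha * d) @ d < c2 * gTd:
            lo = alpha                                   # curvature fails: grow
        else:
            return alpha                                 # both conditions hold
        alpha = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * lo
    return alpha
```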


Filomat, 2018, Vol. 32 (6), pp. 2173-2191
Author(s): Hamid Esmaeili, Majid Rostami, Morteza Kimiaei

We present a new spectral conjugate gradient method based on the Dai-Yuan strategy to solve large-scale unconstrained optimization problems, with applications to compressive sensing. In our method, the numerator of the conjugate gradient parameter is a convex combination of the maximum gradient norm over some preceding iterates and the current gradient norm. This combination tends to produce larger step sizes far from the optimizer and smaller step sizes close to it. In addition, the spectral parameter guarantees the descent property of the generated direction at each iteration. Global convergence results are established under standard assumptions. Numerical results are reported which indicate the promising behavior of the new procedure on large-scale unconstrained optimization and compressive sensing problems.
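
A minimal sketch of what such a direction update might look like, assuming a DY-type denominator, a mixing weight xi, and a descent-forcing choice of the spectral parameter; all three are placeholder choices, and the paper's exact formulas differ in detail.

```python
import numpy as np

def spectral_dy_direction(g_new, g, d, max_gnorm, xi=0.5):
    """One direction update of a spectral DY-type method (illustrative).
    The CG parameter's numerator mixes the largest recent gradient norm
    (max_gnorm, over some window of past iterates) with the current one;
    theta is then chosen so that g_new'd_new = -||g_new||^2, a standard
    way to force descent. xi and the window are assumptions."""
    y = g_new - g
    gg = g_new @ g_new
    beta = (xi * max_gnorm**2 + (1.0 - xi) * gg) / (d @ y)  # DY-type parameter
    theta = 1.0 + beta * (g_new @ d) / gg                   # descent-forcing spectral parameter
    return -theta * g_new + beta * d
```

With this choice of theta, g_new'd_new = -theta*gg + beta*(g_new'd) = -gg exactly, so the direction is a descent direction at every iteration regardless of the line search.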


2014, Vol. 2014, pp. 1-9
Author(s): Shengwei Yao, Bin Qin

The conjugate gradient method is an efficient method for solving large-scale nonlinear optimization problems. In this paper, we propose a nonlinear conjugate gradient method that can be considered a hybrid of the DL and WYL conjugate gradient methods. The given method satisfies the sufficient descent condition under the Wolfe-Powell line search and is globally convergent for general functions. Our numerical results show that the proposed method is robust and efficient on the test problems.
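
As a rough illustration of the hybridization idea, the sketch below pairs a WYL-style numerator with a DL-style correction term. The parameter t and the exact way the two formulas are combined are assumptions for illustration, not necessarily the paper's formula.

```python
import numpy as np

def hybrid_dl_wyl_beta(g_new, g, d, s, t=0.1):
    """Illustrative hybrid of the WYL and DL parameters, with
    s = x_{k+1} - x_k and y = g_{k+1} - g_k.
    WYL numerator: g_new'(g_new - (||g_new||/||g||) g), which damps the
    PRP numerator; DL contributes the correction term -t*(g_new's)/(d'y)."""
    y = g_new - g
    wyl_num = g_new @ (g_new - (np.linalg.norm(g_new) / np.linalg.norm(g)) * g)
    return wyl_num / (g @ g) - t * (g_new @ s) / (d @ y)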


2021, Vol. 2021, pp. 1-9
Author(s): Minglei Fang, Min Wang, Min Sun, Rong Chen

Nonlinear conjugate gradient algorithms are a very effective way of solving large-scale unconstrained optimization problems. Building on several well-known earlier conjugate gradient methods, a modified hybrid conjugate gradient method is proposed. The proposed method generates descent directions at every iteration, independent of any line search. Under the Wolfe line search, the proposed method possesses global convergence. Numerical results show that the modified method is efficient and robust.
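
The hybridization idea itself is classical; for instance, a well-known hybrid parameter clips the PRP value by the FR value, as in this sketch. It is shown only to illustrate the idea, not the paper's modified formula.

```python
import numpy as np

def hybrid_beta(g_new, g):
    """Classical hybrid parameter beta = max(0, min(beta_PRP, beta_FR)):
    keeps PRP's good restart behavior while inheriting FR's convergence
    safeguards. Illustrative of hybridization in general."""
    beta_fr = (g_new @ g_new) / (g @ g)
    beta_prp = g_new @ (g_new - g) / (g @ g)
    return max(0.0, min(beta_prp, beta_fr))
```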


2014, Vol. 2014, pp. 1-6
Author(s): Mohd Asrul Hery Ibrahim, Mustafa Mamat, Wah June Leong

In solving large-scale problems, the quasi-Newton method is known as one of the most efficient methods for unconstrained optimization. Hence, a new hybrid method, known as the BFGS-CG method, has been created based on these properties, combining the search directions of conjugate gradient methods and quasi-Newton methods. In comparison to standard BFGS methods and conjugate gradient methods, the BFGS-CG method shows significant improvement in the total number of iterations and CPU time required to solve large-scale unconstrained optimization problems. We also prove that the hybrid method is globally convergent.
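
One plausible shape for such a combined direction is sketched below, assuming a standard BFGS inverse-Hessian update and an FR-type coefficient on the previous direction; both are illustrative choices, not necessarily the paper's exact construction.

```python
import numpy as np

def bfgs_update(H, s, y):
    """Standard BFGS update of the inverse-Hessian approximation H,
    with s = x_{k+1} - x_k and y = g_{k+1} - g_k."""
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

def bfgs_cg_direction(H, g_new, g, d):
    """Quasi-Newton step augmented with a CG-style momentum term;
    the FR-type coefficient on the previous direction d is an
    illustrative choice of the hybrid weighting."""
    beta = (g_new @ g_new) / (g @ g)
    return -H @ g_new + beta * d
```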


2019, Vol. 2019 (1)
Author(s): Yuting Chen, Mingyuan Cao, Yueting Yang

In this paper, we present a new conjugate gradient method using an acceleration scheme for solving large-scale unconstrained optimization problems. The generated search direction satisfies both the sufficient descent condition and the Dai-Liao conjugacy condition, independent of line search. Moreover, the parameter value incorporates more useful information without adding computational cost or storage requirements, which improves the numerical performance. Under proper assumptions, the global convergence of the proposed method with a Wolfe line search is established. Numerical experiments show that the given method is competitive for unconstrained optimization problems with dimensions up to 100,000.
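
The two properties the direction is required to satisfy can be stated and checked numerically; the sketch below makes them explicit, with t and c as placeholder constants.

```python
import numpy as np

def check_direction(d_new, g_new, s, y, t=0.1, c=1e-4):
    """Verify the two required properties, with s = x_{k+1} - x_k,
    y = g_{k+1} - g_k, and placeholder constants t, c:
        sufficient descent:   g_new'd_new <= -c*||g_new||^2
        Dai-Liao conjugacy:   d_new'y = -t*(g_new's)"""
    descent = g_new @ d_new <= -c * (g_new @ g_new)
    dl_conjugacy = np.isclose(d_new @ y, -t * (g_new @ s))
    return descent, dl_conjugacy
```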


Author(s): Hawraz N. Jabbar, Basim A. Hassan

Conjugate gradient methods are exceedingly valuable for solving large-scale unconstrained optimization problems since they do not require the storage of matrices. Research on these methods focuses mostly on the conjugate parameter. The current paper proposes new conjugate-gradient-type parameters for solving large-scale unconstrained optimization problems. A Hessian approximation in diagonal matrix form, based on second- and third-order Taylor series expansions, is employed in this study. The sufficient descent property of the proposed algorithm is proved, and the new method is shown to converge globally. The new algorithm is found to be competitive with the Fletcher-Reeves (FR) algorithm in a number of numerical experiments.
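
A minimal sketch of a Taylor-based diagonal (here scalar) Hessian approximation in the spirit of the paper; the paper's construction also uses third-order terms, so this second-order version is only illustrative.

```python
import numpy as np

def taylor_diag_hessian(f_old, f_new, g_new, s):
    """Scalar curvature estimate from a second-order Taylor expansion
    around x_{k+1}, with s = x_{k+1} - x_k:
        f_old ~ f_new - g_new's + 0.5 * s'(rho*I)s
    which gives rho = 2*(f_old - f_new + g_new's)/||s||^2. A diagonal
    (here scalar) sketch; the paper's formula differs in detail."""
    rho = 2.0 * (f_old - f_new + g_new @ s) / (s @ s)
    return rho * np.eye(len(s))   # diagonal Hessian approximation
```

An estimate of this kind can then feed the conjugate parameter, e.g. replacing the exact Hessian wherever a curvature term s'Bs appears.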

