New modification of the Hestenes-Stiefel with strong Wolfe line search

2021 ◽  
Author(s):  
Nur Athira Japri ◽  
Srimazzura Basri ◽  
Mustafa Mamat


2014 ◽  
Vol 989-994 ◽  
pp. 1802-1805
Author(s):  
Hong Fang Cui

Building on the CD conjugate gradient method, this article constructs a new two-parameter P-NCD projected conjugate gradient method, establishes the descent property of its projection and its convergence criteria under the strong Wolfe line search, and applies the new algorithm to estimating an equation model with linear constraints; the instantiated tests show good results.


2015 ◽  
Vol 9 ◽  
pp. 3105-3117 ◽  
Author(s):  
Norhaslinda Zull ◽  
Mohd Rivaie ◽  
Mustafa Mamat ◽  
Zabidin Salleh ◽  
Zahrahtul Amani

Author(s):  
P. Kaelo ◽  
Sindhu Narayanan ◽  
M.V. Thuto

This article presents a modified quadratic hybridization of the Polak–Ribiere–Polyak and Fletcher–Reeves conjugate gradient methods for solving unconstrained optimization problems. Global convergence of the proposed quadratic hybrid conjugate gradient method is established under the strong Wolfe line search conditions. We also report some numerical results to show the competitiveness of the new hybrid method.
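The paper's quadratic hybridization formula is not reproduced in the abstract; as a rough sketch only, the snippet below mixes the standard PRP and FR coefficients with a plain convex combination (the weight `theta` is an arbitrary stand-in, not the paper's rule) and verifies the strong Wolfe conditions at every step, using the exact step on a convex quadratic, which satisfies them automatically:

```python
import numpy as np

def strong_wolfe_holds(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
    """Check the two strong Wolfe conditions at step length alpha."""
    g0d = grad(x) @ d
    armijo = f(x + alpha * d) <= f(x) + c1 * alpha * g0d
    curvature = abs(grad(x + alpha * d) @ d) <= c2 * abs(g0d)
    return armijo and curvature

# Convex quadratic test problem f(x) = 0.5 x^T A x - b^T x.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

theta = 0.5          # hypothetical mixing weight (the paper derives its own)
x = np.zeros(2)
g = grad(x)
d = -g
for _ in range(50):
    if np.linalg.norm(g) < 1e-10:
        break
    alpha = -(g @ d) / (d @ A @ d)     # exact step; satisfies strong Wolfe
    assert strong_wolfe_holds(f, grad, x, d, alpha)
    x_new = x + alpha * d
    g_new = grad(x_new)
    beta_fr = (g_new @ g_new) / (g @ g)              # Fletcher-Reeves
    beta_prp = (g_new @ (g_new - g)) / (g @ g)       # Polak-Ribiere-Polyak
    beta = theta * beta_prp + (1 - theta) * beta_fr  # stand-in hybrid
    d = -g_new + beta * d
    x, g = x_new, g_new

print(np.round(x, 6))   # approximate minimizer A^{-1} b
```

On a quadratic with exact line search the successive gradients are mutually orthogonal, so PRP and FR coincide and any convex combination reduces to linear CG; the interest of the hybridization lies in the general nonlinear case.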


2019 ◽  
Vol 13 (04) ◽  
pp. 2050081
Author(s):  
Badreddine Sellami ◽  
Mohamed Chiheb Eddine Sellami

In this paper, we are concerned with conjugate gradient methods for solving unconstrained optimization problems. We propose a modified Fletcher–Reeves (abbreviated FR) [Function minimization by conjugate gradients, Comput. J. 7 (1964) 149–154] conjugate gradient algorithm satisfying a parametrized sufficient descent condition with a parameter [Formula: see text]. The parameter [Formula: see text] is computed by means of the conjugacy condition, so that the resulting algorithm is a positive multiplicative modification of the Hestenes and Stiefel (abbreviated HS) [Methods of conjugate gradients for solving linear systems, J. Res. Nat. Bur. Standards Sec. B 48 (1952) 409–436] algorithm and produces a descent search direction at every iteration at which the line search satisfies the Wolfe conditions. Under appropriate conditions, we show that the modified FR method with the strong Wolfe line search is globally convergent for uniformly convex functions. We also present extensive preliminary numerical experiments to show the efficiency of the proposed method.
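The paper's parametrized coefficient is not spelled out in the abstract, so the sketch below only illustrates the two ingredients it names, on a toy convex quadratic with exact line search: the Hestenes–Stiefel coefficient enforces the conjugacy condition d_{k+1}^T y_k = 0 by construction, and the resulting directions satisfy a sufficient descent condition g_k^T d_k ≤ −c‖g_k‖²:

```python
import numpy as np

A = np.array([[3.0, 0.5], [0.5, 2.0]])   # SPD Hessian of a toy quadratic
b = np.array([1.0, -1.0])
grad = lambda x: A @ x - b

x = np.array([2.0, 2.0])
g = grad(x)
d = -g
for _ in range(10):
    alpha = -(g @ d) / (d @ A @ d)       # exact line search step
    x = x + alpha * d
    g_new = grad(x)
    y = g_new - g                        # gradient difference y_k
    beta_hs = (g_new @ y) / (d @ y)      # Hestenes-Stiefel coefficient
    d_new = -g_new + beta_hs * d
    # Conjugacy condition d_{k+1}^T y_k = 0 holds by construction:
    # d_new @ y = -g_new @ y + beta_hs * (d @ y) = 0.
    assert abs(d_new @ y) < 1e-8
    # Sufficient descent g^T d <= -c ||g||^2 (here with c = 0.5):
    # exact line search gives g_new @ d = 0, so g_new @ d_new = -||g_new||^2.
    assert g_new @ d_new <= -0.5 * (g_new @ g_new) + 1e-12
    g, d = g_new, d_new
    if np.linalg.norm(g) < 1e-10:
        break

print("conjugacy and sufficient descent hold at every step")
```

The paper's contribution is to keep such a descent property under the (inexact) Wolfe line search rather than the exact one used in this toy sketch.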


Author(s):  
Chenna Nasreddine ◽  
Sellami Badreddine ◽  
Belloufi Mohammed

In this paper, we present a new hybrid conjugate gradient method for solving nonlinear unconstrained optimization problems, based on a convex combination of the Liu–Storey (LS) and Hager–Zhang (HZ) conjugate gradient methods. The method possesses the sufficient descent property and global convergence under the strong Wolfe line search. We conclude the paper by illustrating the method with some numerical examples.
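The standard LS and HZ coefficients are shown below; how the paper chooses the convex-combination parameter is not stated in the abstract, so a fixed weight `theta = 0.5` is used here purely for illustration, on a small quadratic with exact line search:

```python
import numpy as np

def beta_ls(g_new, g, d, y):
    """Liu-Storey coefficient: g_{k+1}^T y_k / (-g_k^T d_k)."""
    return (g_new @ y) / (-(g @ d))

def beta_hz(g_new, d, y):
    """Hager-Zhang coefficient: (y - 2 d ||y||^2 / d^T y)^T g_{k+1} / d^T y."""
    dy = d @ y
    return (y - 2.0 * d * (y @ y) / dy) @ g_new / dy

A = np.diag([1.0, 5.0, 25.0])             # ill-conditioned SPD quadratic
b = np.array([1.0, 1.0, 1.0])
grad = lambda x: A @ x - b

theta = 0.5                               # hypothetical fixed weight
x = np.zeros(3)
g = grad(x)
d = -g
for _ in range(100):
    if np.linalg.norm(g) < 1e-10:
        break
    alpha = -(g @ d) / (d @ A @ d)        # exact line search on the quadratic
    x = x + alpha * d
    g_new = grad(x)
    y = g_new - g
    beta = (1 - theta) * beta_ls(g_new, g, d, y) + theta * beta_hz(g_new, d, y)
    d = -g_new + beta * d
    g = g_new

print(np.round(x, 6))    # should approach A^{-1} b = [1, 0.2, 0.04]
```

Under exact line search both coefficients collapse to PRP/HS on a quadratic, so the sketch recovers linear CG; the convex combination only shows its distinct behavior with an inexact strong Wolfe search on general functions.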


2019 ◽  
Vol 14 (1) ◽  
pp. 1-9
Author(s):  
P. Kaelo ◽  
P. Mtagulwa ◽  
M. V. Thuto

In this paper, we develop a new hybrid conjugate gradient method that inherits the features of the Liu and Storey (LS), Hestenes and Stiefel (HS), Dai and Yuan (DY) and Conjugate Descent (CD) conjugate gradient methods. The new method generates a descent direction independently of any line search and possesses good convergence properties under the strong Wolfe line search conditions. Numerical results show that the proposed method is robust and efficient.
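The exact hybrid formula is given in the paper; the sketch below merely computes the four classical coefficients the abstract names (LS, HS, DY, CD), combines them with a hypothetical clamp, and restarts with d = −g whenever a descent test fails, so every direction is a descent direction independently of the (here deliberately Armijo-only, non-Wolfe) line search:

```python
import numpy as np

A = np.array([[2.0, 0.3], [0.3, 1.0]])    # SPD Hessian of a toy quadratic
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

def backtrack(x, d, g, tau=0.5, c1=1e-4):
    """Plain Armijo backtracking -- deliberately NOT a Wolfe search."""
    alpha = 1.0
    while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
        alpha *= tau
    return alpha

c = 0.1                                   # descent constant
x = np.array([3.0, -3.0])
g = grad(x)
d = -g
for _ in range(200):
    if np.linalg.norm(g) < 1e-8:
        break
    alpha = backtrack(x, d, g)
    x = x + alpha * d
    g_new = grad(x)
    y = g_new - g
    dy, gd = d @ y, -(g @ d)              # both > 0 on a convex quadratic
    betas = {                             # the four classical coefficients
        "HS": (g_new @ y) / dy,
        "DY": (g_new @ g_new) / dy,
        "CD": (g_new @ g_new) / gd,
        "LS": (g_new @ y) / gd,
    }
    beta = max(0.0, min(betas.values()))  # hypothetical clamped hybrid
    d = -g_new + beta * d
    if g_new @ d > -c * (g_new @ g_new):  # descent test failed:
        d = -g_new                        # restart with steepest descent
    g = g_new

print("converged with descent directions only")
```

The restart guard is what makes the descent property independent of the line search: even if the step returned by `backtrack` gives a poor coefficient, the direction is reset before it can become an ascent direction.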

