A NEW THREE-TERM CONJUGATE GRADIENT-BASED PROJECTION METHOD FOR SOLVING LARGE-SCALE NONLINEAR MONOTONE EQUATIONS

2019 ◽  
Vol 24 (4) ◽  
pp. 550-563
Author(s):  
Mompati Koorapetse ◽  
Professor Kaelo

A new three-term conjugate gradient-based projection method is presented in this paper for solving large-scale nonlinear monotone equations. The method is derivative-free, and its low storage requirements make it suitable for large-scale problems. The method satisfies the sufficient descent condition F_k^T d_k ≤ −τ‖F_k‖², where τ > 0 is a constant, and its global convergence is also established. Numerical results show that the method is efficient and promising.
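For illustration only, here is a minimal Python sketch (not the authors' code) of a generic three-term direction that satisfies a condition of this type by construction; the coefficient choice below is an assumed representative formula, not necessarily the one proposed in the paper.

```python
import numpy as np

def three_term_direction(F_k, d_prev, y_prev):
    """Generic three-term direction d_k = -F_k + beta*d_prev - theta*y_prev.

    With beta = (F_k . y_prev)/denom and theta = (F_k . d_prev)/denom, the two
    correction terms cancel in F_k^T d_k, so F_k^T d_k = -||F_k||^2 holds by
    construction (an illustrative choice, not the paper's formula).
    """
    denom = max(np.dot(d_prev, y_prev), 1e-12)  # safeguard against division by ~0
    beta = np.dot(F_k, y_prev) / denom
    theta = np.dot(F_k, d_prev) / denom
    return -F_k + beta * d_prev - theta * y_prev

def sufficient_descent(F_k, d_k, tau=1e-4):
    """Check F_k^T d_k <= -tau * ||F_k||^2 (tau here is an assumed value)."""
    return np.dot(F_k, d_k) <= -tau * np.dot(F_k, F_k)
```

For this particular choice any 0 < τ ≤ 1 is admissible, since F_k^T d_k = −‖F_k‖² exactly.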

Author(s):  
Mompati Koorapetse ◽  
P Kaelo ◽  
S Kooepile-Reikeletseng

In this paper, a new modified Perry-type derivative-free projection method for solving large-scale nonlinear monotone equations is presented. The method is developed by combining a modified Perry conjugate gradient method with the hyperplane projection technique. The global convergence of the proposed method is established, and preliminary numerical results show that it is promising and efficient compared to some existing methods in the literature.
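The hyperplane projection technique mentioned here is commonly attributed to Solodov and Svaiter; a minimal sketch of that projection step follows, with the names of the current iterate x_k and trial point z_k assumed for illustration.

```python
import numpy as np

def hyperplane_projection(x_k, z_k, F):
    """Solodov-Svaiter style projection of x_k onto the hyperplane
    {x : F(z_k)^T (x - z_k) = 0}. For monotone F, every solution lies on the
    non-positive side of this hyperplane, so the step moves x_k toward the
    solution set."""
    Fz = F(z_k)
    lam = np.dot(Fz, x_k - z_k) / np.dot(Fz, Fz)  # assumes F(z_k) != 0
    return x_k - lam * Fz
```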


2014 ◽  
Vol 2014 ◽  
pp. 1-12 ◽  
Author(s):  
San-Yang Liu ◽  
Yuan-Yuan Huang ◽  
Hong-Wei Jiao

Two unified frameworks of some sufficient descent conjugate gradient methods are considered. Combined with the hyperplane projection method of Solodov and Svaiter, they are extended to solve convex constrained nonlinear monotone equations. Their global convergence is proven under some mild conditions. Numerical results illustrate that these methods are efficient and can be applied to solve large-scale nonsmooth equations.
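For convex constrained equations, the hyperplane projection step is typically followed by a projection onto the feasible set; the sketch below assumes a simple box as the convex set purely for illustration and is not the specific update of these methods.

```python
import numpy as np

def project_onto_box(x, lower, upper):
    """Euclidean projection onto the box {x : lower <= x <= upper},
    standing in for a general convex feasible set C."""
    return np.clip(x, lower, upper)

def constrained_projection_step(x_k, z_k, F, lower, upper):
    """Generic pattern: Solodov-Svaiter hyperplane projection of x_k,
    followed by projection back onto the feasible set."""
    Fz = F(z_k)
    lam = np.dot(Fz, x_k - z_k) / np.dot(Fz, Fz)
    return project_onto_box(x_k - lam * Fz, lower, upper)
```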


2018 ◽  
Vol 7 (4.30) ◽  
pp. 458
Author(s):  
Srimazzura Basri ◽  
Mustafa Mamat ◽  
Puspa Liza Ghazali

Non-linear conjugate gradient methods have been widely used in solving large-scale optimization problems. These methods have been shown to require very little memory in addition to being numerically efficient. Thus, many studies have been conducted to improve these methods and find the most efficient one. In this paper, we propose a new non-linear conjugate gradient coefficient that guarantees the sufficient descent condition. Numerical tests indicate that the proposed coefficient performs better than three classical conjugate gradient coefficients.
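The abstract does not name the three classical coefficients; assuming they are the Fletcher-Reeves, Polak-Ribière-Polyak, and Hestenes-Stiefel formulas, a minimal sketch of these standard choices is given below for reference.

```python
import numpy as np

def beta_fletcher_reeves(g_new, g_old, d_old=None):
    """FR: ||g_{k+1}||^2 / ||g_k||^2 (d_old unused, kept for a uniform interface)."""
    return np.dot(g_new, g_new) / np.dot(g_old, g_old)

def beta_polak_ribiere_polyak(g_new, g_old, d_old=None):
    """PRP: g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2."""
    return np.dot(g_new, g_new - g_old) / np.dot(g_old, g_old)

def beta_hestenes_stiefel(g_new, g_old, d_old):
    """HS: g_{k+1}^T y_k / (d_k^T y_k), with y_k = g_{k+1} - g_k."""
    y = g_new - g_old
    return np.dot(g_new, y) / np.dot(d_old, y)
```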


2013 ◽  
Vol 11 (5) ◽  
pp. 2586-2600
Author(s):  
Gonglin Yuan ◽  
Yong Li

At present, the conjugate gradient (CG) method of Hager and Zhang (SIAM Journal on Optimization, 16 (2005)) is regarded as one of the most effective CG methods for optimization problems. To study this method further, we extend the Hager-Zhang CG method and present two modified CG formulas that use not only gradient values but also function values. Moreover, the sufficient descent condition holds without any line search. Global convergence is established for nonconvex functions under suitable conditions. Numerical results show that the proposed methods are competitive with the standard conjugate gradient method.
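For reference, here is a minimal sketch of the original Hager-Zhang coefficient that the two modified formulas build on; the modifications incorporating function values are not reproduced, as the abstract does not give them.

```python
import numpy as np

def beta_hager_zhang(g_new, d_old, y):
    """Original Hager-Zhang (2005) coefficient:
    beta = (y - 2*d*||y||^2 / (d^T y))^T g_{k+1} / (d^T y),
    with y = g_{k+1} - g_k and d the previous search direction."""
    dy = np.dot(d_old, y)
    return np.dot(y - 2.0 * d_old * np.dot(y, y) / dy, g_new) / dy

def cg_direction(g_new, d_old, beta):
    """Standard CG direction update d_{k+1} = -g_{k+1} + beta * d_k."""
    return -g_new + beta * d_old
```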

