A Projection Hestenes–Stiefel Method with Spectral Parameter for Nonlinear Monotone Equations and Signal Processing

2020 ◽  
Vol 25 (2) ◽  
pp. 27
Author(s):  
Aliyu Muhammed Awwal ◽  
Lin Wang ◽  
Poom Kumam ◽  
Hassan Mohammad ◽  
Wiboonsak Watthayu

A number of practical problems in science and engineering can be converted into a system of nonlinear equations, and it is therefore imperative to develop efficient methods for solving such equations. Due to their nice convergence properties and low storage requirements, conjugate gradient methods are considered among the most efficient for solving large-scale nonlinear equations. In this paper, a modified conjugate gradient method is proposed based on a projection technique and a suitable line search strategy. The proposed method is matrix-free and its sequence of search directions satisfies the sufficient descent condition. Under the assumption that the underlying function is monotone and Lipschitz continuous, the global convergence of the proposed method is established. The method is applied to solve some benchmark monotone nonlinear equations and is also extended to solve ℓ1-norm regularized problems to reconstruct a sparse signal in compressive sensing. Numerical comparison with some existing methods shows that the proposed method is competitive, efficient and promising.
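The projection technique the abstract refers to can be illustrated with a minimal sketch of a Solodov–Svaiter-style derivative-free scheme for monotone F(x) = 0. The steepest-descent-like direction below is a placeholder for the paper's Hestenes–Stiefel direction, and all parameter values are illustrative assumptions:

```python
import numpy as np

def projection_method(F, x0, tol=1e-8, max_iter=500,
                      beta=1.0, rho=0.5, sigma=1e-4):
    """Hyperplane-projection scheme for monotone F(x) = 0 (sketch).

    Direction is -F(x), a placeholder; the paper's Hestenes-Stiefel
    direction with spectral parameter would replace it.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        d = -Fx  # placeholder direction; gives F(x)^T d = -||F(x)||^2 < 0
        # Backtracking line search: accept t with -F(x + t d)^T d >= sigma t ||d||^2
        t = beta
        while -F(x + t * d) @ d < sigma * t * (d @ d):
            t *= rho
        z = x + t * d
        Fz = F(z)
        if np.linalg.norm(Fz) < tol:
            return z
        # Project x onto the separating hyperplane {y : F(z)^T (y - z) = 0}
        x = x - (Fz @ (x - z)) / (Fz @ Fz) * Fz
    return x
```

Because arctan is monotone and Lipschitz componentwise, `projection_method(np.arctan, x0)` converges to the zero vector from any starting point, which matches the convergence theory the abstract invokes.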

2014 ◽  
Vol 2014 ◽  
pp. 1-12 ◽  
Author(s):  
San-Yang Liu ◽  
Yuan-Yuan Huang ◽  
Hong-Wei Jiao

Two unified frameworks of some sufficient descent conjugate gradient methods are considered. Combined with the hyperplane projection method of Solodov and Svaiter, they are extended to solve convex constrained nonlinear monotone equations. Their global convergence is proven under some mild conditions. Numerical results illustrate that these methods are efficient and can be applied to solve large-scale nonsmooth equations.
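The extension to convex constrained equations replaces the unconstrained hyperplane-projection update with a projection back onto the feasible set C. A minimal sketch of one such update, assuming C is a box so the Euclidean projection is a componentwise clip (function names are illustrative):

```python
import numpy as np

def project_box(y, lo, hi):
    """Euclidean projection onto the box C = [lo, hi]^n."""
    return np.clip(y, lo, hi)

def constrained_step(x, z, Fz, lo, hi):
    """One hyperplane-projection update for convex-constrained monotone
    equations: take the unconstrained projection step along F(z), then
    project the result back onto C (a box here).  Sketch only."""
    lam = Fz @ (x - z) / (Fz @ Fz)
    return project_box(x - lam * Fz, lo, hi)
```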


2018 ◽  
Vol 7 (4.30) ◽  
pp. 458
Author(s):  
Srimazzura Basri ◽  
Mustafa Mamat ◽  
Puspa Liza Ghazali

Non-linear conjugate gradient methods have been widely used in solving large-scale optimization problems. These methods require very little memory in addition to being numerically efficient, and many studies have therefore been conducted to improve them and find the most efficient variant. In this paper, we propose a new non-linear conjugate gradient coefficient that guarantees the sufficient descent condition. Numerical tests indicate that the proposed coefficient performs better than the three classical conjugate gradient coefficients.
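For context, the sufficient descent condition d_k^T g_k ≤ −c‖g_k‖² can be checked numerically against the classical coefficients such an abstract compares with. A minimal sketch, in which the paper's new coefficient is not reproduced and the constant c is an illustrative choice:

```python
import numpy as np

def cg_direction(g_new, g_old, d_old, variant="PRP"):
    """Search direction d = -g + beta * d_old for three classical CG
    coefficients (Fletcher-Reeves, Polak-Ribiere-Polyak, Hestenes-Stiefel)."""
    y = g_new - g_old
    if variant == "FR":
        beta = (g_new @ g_new) / (g_old @ g_old)
    elif variant == "PRP":
        beta = (g_new @ y) / (g_old @ g_old)
    elif variant == "HS":
        beta = (g_new @ y) / (d_old @ y)
    else:
        raise ValueError(variant)
    return -g_new + beta * d_old

def sufficient_descent(g, d, c=1e-4):
    """Check the sufficient descent condition d^T g <= -c * ||g||^2."""
    return d @ g <= -c * (g @ g)
```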


2009 ◽  
Vol 2009 ◽  
pp. 1-16 ◽  
Author(s):  
Jianguo Zhang ◽  
Yunhai Xiao ◽  
Zengxin Wei

Two nonlinear conjugate gradient-type methods for solving unconstrained optimization problems are proposed. An attractive property of the methods is that, without any line search, the generated directions are always descent directions. Under some mild conditions, global convergence results for both methods are established. Preliminary numerical results show that the proposed methods are promising and competitive with the well-known PRP method.


2017 ◽  
Vol 6 (4) ◽  
pp. 147 ◽  
Author(s):  
Abubakar Sani Halilu ◽  
H. Abdullahi ◽  
Mohammed Yusuf Waziri

A variant method for solving systems of nonlinear equations is presented. The method uses a special form of iteration with two step-length parameters: we suggest a derivative-free method that avoids computing the Jacobian by means of an acceleration parameter, together with an inexact line search procedure. The proposed method is proven to be globally convergent under mild conditions. The preliminary numerical comparisons reported in this paper, using large-scale benchmark test problems, show that the proposed method is quite effective in practice.
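The acceleration-parameter idea can be sketched generically: approximate the Jacobian by a scalar multiple of the identity, γ_k I, and update γ_k from a secant-style condition. This is a hedged sketch of the general idea under those assumptions, not the authors' exact two-step scheme:

```python
import numpy as np

def accel_param_method(F, x0, tol=1e-8, max_iter=200):
    """Derivative-free iteration with a scalar acceleration parameter
    gamma_k approximating the Jacobian as gamma_k * I.  Sketch only:
    no line search, and not the paper's exact double-step scheme."""
    x = np.asarray(x0, dtype=float)
    gamma = 1.0
    Fx = F(x)
    for _ in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        d = -Fx / gamma          # Jacobian approximated by gamma * I
        x_new = x + d            # full step; a line search would go here
        F_new = F(x_new)
        s, y = x_new - x, F_new - Fx
        # Secant-style update; the floor keeps gamma positive
        gamma = max((y @ s) / (s @ s), 1e-8)
        x, Fx = x_new, F_new
    return x
```

For a linear monotone map such as F(x) = 2x, the secant update recovers γ = 2 after one step and the iteration lands on the root exactly.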


2020 ◽  
Vol 30 (4) ◽  
pp. 399-412
Author(s):  
Abubakar Halilu ◽  
Mohammed Waziri ◽  
Ibrahim Yusuf

We propose a matrix-free direction with an inexact line search technique to solve systems of nonlinear equations using a double direction approach. In this article, we approximate the Jacobian matrix by an appropriately constructed matrix-free method via an acceleration parameter. The global convergence of our method is established under mild conditions. Numerical comparisons reported in this paper are based on a set of large-scale test problems and show that the proposed method is efficient for large-scale problems.


2019 ◽  
Vol 24 (4) ◽  
pp. 550-563
Author(s):  
Mompati Koorapetse ◽  
Professor Kaelo

A new three-term conjugate gradient-based projection method is presented in this paper for solving large-scale nonlinear monotone equations. This method is derivative-free and it is suitable for solving large-scale nonlinear monotone equations due to its lower storage requirements. The method satisfies the sufficient descent condition F_k^T d_k ≤ −τ‖F_k‖², where τ > 0 is a constant, and its global convergence is also established. Numerical results show that the method is efficient and promising.


Author(s):  
Jamilu Sabi'u ◽  
Abdullah Shah

In this article, we propose two conjugate gradient (CG) parameters using the modified Dai–Liao condition and the descent three-term CG search direction. Both parameters are incorporated with the projection technique for solving large-scale monotone nonlinear equations. Using the Lipschitz and monotone assumptions, the global convergence of both methods is proved. Finally, numerical results are provided to illustrate the robustness of the proposed methods.
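For reference, the classical (unmodified) Dai–Liao coefficient, β_k = g_{k+1}^T (y_k − t s_k) / (d_k^T y_k), can be sketched as follows; the paper's modified condition and its two new parameters are not reproduced here, and the function name and default t are illustrative:

```python
import numpy as np

def dai_liao_beta(g_new, d_old, s, y, t=0.1):
    """Classical Dai-Liao CG coefficient:
    beta = g_{k+1}^T (y_k - t * s_k) / (d_k^T y_k),
    with s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k."""
    return g_new @ (y - t * s) / (d_old @ y)
```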


Author(s):  
Mompati Koorapetse ◽  
P Kaelo ◽  
S Kooepile-Reikeletseng

In this paper, a new modified Perry-type derivative-free projection method for solving large-scale nonlinear monotone equations is presented. The method is developed by combining a modified Perry conjugate gradient method with the hyperplane projection technique. The global convergence of the proposed method is established. Preliminary numerical results show that the proposed method is promising and efficient compared to some existing methods in the literature.

