Projection method with inertial step for nonlinear equations: Application to signal recovery

2021
Author(s):  
Abdulkarim Hassan Ibrahim ◽  
Poom Kumam ◽  
Min Sun ◽  
Parin Chaipunya ◽  
Auwal Bala Abubakar

In this paper, using the concept of inertial extrapolation, we introduce a globally convergent inertial extrapolation method for solving nonlinear equations with convex constraints for which the underlying mapping is monotone and Lipschitz continuous. The method can be viewed as a combination of the efficient three-term derivative-free method of Gao and He [Calcolo 55(4), 1–17, 2018] with the inertial extrapolation step. Moreover, the algorithm is designed such that, at every iteration, the method is free from derivative evaluations. Under standard assumptions, we establish global convergence results for the proposed method. Numerical implementations illustrate the performance and advantage of this new method. Furthermore, we extend the method to solve LASSO problems for decoding a sparse signal in compressive sensing. Performance comparisons illustrate the effectiveness and competitiveness of our algorithm.
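The combination described above, an inertial extrapolation step followed by a derivative-free line search and a hyperplane projection, can be sketched generically. This is an illustration under stated assumptions (a monotone Lipschitz mapping `F`, a projector `project` onto the constraint set, the simplest descent direction `d = -F(w)`, and illustrative parameter values), not the authors' exact three-term scheme:

```python
import numpy as np

def inertial_projection_solve(F, project, x0, theta=0.5, sigma=1e-4,
                              beta=0.5, tol=1e-8, max_iter=1000):
    """Hypothetical sketch of an inertial derivative-free projection
    iteration for a monotone mapping F over a convex set.
    All parameter values are illustrative, not tuned."""
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(max_iter):
        # Inertial extrapolation: lean on the previous step.
        w = x + theta * (x - x_prev)
        Fw = F(w)
        if np.linalg.norm(Fw) < tol:
            return w
        d = -Fw  # simplest direction satisfying sufficient descent
        # Derivative-free backtracking: find t with
        #   -F(w + t d)^T d >= sigma * t * ||d||^2
        t = 1.0
        while -F(w + t * d) @ d < sigma * t * (d @ d):
            t *= beta
        z = w + t * d
        Fz = F(z)
        if np.linalg.norm(Fz) < tol:
            return z
        # Solodov–Svaiter-style hyperplane projection, then projection
        # back onto the constraint set.
        x_prev, x = x, project(w - (Fz @ (w - z)) / (Fz @ Fz) * Fz)
    return x
```

With `F(x) = x` and the whole space as constraint set (`project` the identity), the iterates shrink geometrically toward the unique zero at the origin.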

2021 ◽  
Vol 40 (3) ◽  
pp. 64-75
Author(s):  
Kanikar Muangchoo

In this paper, by combining the Solodov and Svaiter projection technique with the conjugate gradient method for unconstrained optimization proposed by Mohamed et al. (2020), we develop a derivative-free conjugate gradient method to solve nonlinear equations with convex constraints. The proposed method involves a spectral parameter which satisfies the sufficient descent condition. The global convergence is proved under the assumption that the underlying mapping is Lipschitz continuous and satisfies a weaker monotonicity condition. Numerical experiments show that the proposed method is efficient.


Mathematics ◽  
2019 ◽  
Vol 7 (9) ◽  
pp. 767 ◽  
Author(s):  
Abubakar ◽  
Kumam ◽  
Mohammad ◽  
Awwal

This research paper proposes a derivative-free method for solving systems of nonlinear equations with closed and convex constraints, where the functions under consideration are continuous and monotone. Given an initial iterate, the process first generates a specific direction and then employs a line search strategy along that direction to calculate a new iterate. If the new iterate solves the problem, the process stops. Otherwise, the projection of the new iterate onto the closed convex set (constraint set) determines the next iterate. In addition, the direction satisfies the sufficient descent condition, and the global convergence of the method is established under suitable assumptions. Finally, some numerical experiments are presented to show the performance of the proposed method in solving nonlinear equations and its application in image recovery problems.
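The generate-direction / line-search / project loop described above can be sketched as follows. The direction used here is the simplest one satisfying sufficient descent, not the paper's specific choice, and `proj_C` and all parameter values are illustrative assumptions:

```python
import numpy as np

def projected_df_method(F, proj_C, x0, sigma=1e-4, beta=0.5,
                        tol=1e-8, max_iter=500):
    """Hypothetical sketch of the three-stage loop: build a descent
    direction, run a derivative-free line search, project onto the
    constraint set C."""
    x = x0.copy()
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            return x            # current iterate solves the problem
        d = -Fx                 # gives F(x)^T d = -||F(x)||^2 (sufficient descent)
        t = 1.0                 # backtrack until -F(x+td)^T d >= sigma*t*||d||^2
        while -F(x + t * d) @ d < sigma * t * (d @ d):
            t *= beta
        z = x + t * d
        Fz = F(z)
        if np.linalg.norm(Fz) < tol:
            return z
        # Hyperplane projection followed by projection onto C.
        x = proj_C(x - (Fz @ (x - z)) / (Fz @ Fz) * Fz)
    return x
```

For example, with the monotone mapping `F(x) = x - 1` and the nonnegative orthant as constraint set, the iterates approach the constrained solution at the all-ones vector.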


2021 ◽  
Vol 2021 (1) ◽  
Author(s):  
Abdulkarim Hassan Ibrahim ◽  
Poom Kumam ◽  
Auwal Bala Abubakar ◽  
Jamilu Abubakar

In recent times, various algorithms have been incorporated with the inertial extrapolation step to speed up the convergence of the sequences they generate. As far as we know, very few results exist regarding inertial derivative-free projection methods for solving convex constrained monotone nonlinear equations. In this article, the convergence analysis of a derivative-free iterative algorithm (Liu and Feng in Numer. Algorithms 82(1):245–262, 2019) with an inertial extrapolation step for solving large-scale convex constrained monotone nonlinear equations is studied. The proposed method generates a sufficient descent direction at each iteration. Under some mild assumptions, the global convergence of the sequence generated by the proposed method is established. Furthermore, some experimental results are presented to support the theoretical analysis of the proposed method.


Mathematics ◽  
2021 ◽  
Vol 9 (6) ◽  
pp. 583
Author(s):  
Beny Neta

A new high-order derivative-free method for the solution of a nonlinear equation is developed. The novelty is the use of Traub's method as a first step. The order of convergence is proven and demonstrated numerically. It is also shown that the method has far fewer divergent points and runs faster than an optimal eighth-order derivative-free method.
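For context, a classical derivative-free Traub-type two-step scheme, the kind of "Traub first step" the abstract refers to, can be written as below. This is only a third-order illustration built on a Steffensen divided difference; the paper's high-order construction adds further steps on top of such a first stage:

```python
import math

def traub_steffensen(f, x, tol=1e-12, max_iter=50):
    """Classical derivative-free two-step Traub-type iteration.
    The derivative f'(x) is replaced by the divided difference
    f[x, w] with auxiliary point w = x + f(x)."""
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        w = x + fx                      # auxiliary point for the divided difference
        dd = (f(w) - fx) / (w - x)      # f[x, w] stands in for f'(x)
        y = x - fx / dd                 # Steffensen/Traub first step
        x = y - f(y) / dd               # second step reuses the same divided difference
    return x
```

Applied to `f(t) = t^2 - 2` from `x = 1.5`, the scheme reaches the root `sqrt(2)` to machine precision in a handful of iterations without any derivative evaluations.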


2013 ◽  
Vol 7 (2) ◽  
pp. 390-403 ◽  
Author(s):  
Janak Sharma ◽  
Himani Arora

We present a derivative-free method of fourth-order convergence for solving systems of nonlinear equations. The method consists of two steps, the first of which is the well-known Traub method. A first-order divided difference operator for functions of several variables and direct computation by Taylor expansion are used to prove the local order of convergence. The computational efficiency of the new method in its general form is discussed and compared with existing methods of a similar nature. It is proved that for large systems the new method is more efficient. Some numerical tests are performed to compare the proposed method with existing methods and to confirm the theoretical results.
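The two ingredients named above, a first-order divided difference operator and a Traub-type two-step structure, can be sketched for systems as follows. This is a generic Traub–Steffensen-style illustration, not the paper's exact scheme (whose particular second step is what yields fourth order); the divided-difference construction shown is one common choice:

```python
import numpy as np

def divided_difference(F, y, x):
    """First-order divided difference operator [y, x; F]: column j is
    built from points that mix components of y (up to j) and x (after j),
    one standard construction for several variables."""
    n = x.size
    M = np.empty((n, n))
    for j in range(n):
        u = np.concatenate([y[:j + 1], x[j + 1:]])
        v = np.concatenate([y[:j], x[j:]])
        M[:, j] = (F(u) - F(v)) / (y[j] - x[j])
    return M

def traub_df_system(F, x, tol=1e-9, max_iter=50):
    """Hypothetical derivative-free two-step scheme for systems with a
    Traub first step; the divided difference [w, x; F] replaces the
    Jacobian and is reused in the second step."""
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            return x
        w = x + Fx                           # auxiliary point, as in Steffensen's idea
        M = divided_difference(F, w, x)      # [w, x; F] stands in for F'(x)
        y = x - np.linalg.solve(M, Fx)       # Traub first step
        x = y - np.linalg.solve(M, F(y))     # second step with the frozen operator
    return x
```

On the decoupled test system `F(v) = (v_1^2 - 1, v_2^2 - 4)` the iterates converge rapidly to the root `(1, 2)` without forming a Jacobian.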

