On a New Three-Step Class of Methods and Its Acceleration for Nonlinear Equations

2014 ◽  
Vol 2014 ◽  
pp. 1-9
Author(s):  
T. Lotfi ◽  
K. Mahdiani ◽  
Z. Noori ◽  
F. Khaksar Haghani ◽  
S. Shateyi

A class of derivative-free methods without memory for approximating a simple zero of a nonlinear equation is presented. The proposed class uses four function evaluations per iteration and has convergence order eight; it is therefore an optimal three-step scheme without memory in the sense of the Kung–Traub conjecture. Moreover, the proposed class has an accelerator parameter that can increase the convergence rate from eight to twelve without any new function evaluations. Thus, we construct a with-memory method that considerably increases the efficiency index, from 8^(1/4) ≈ 1.681 to 12^(1/4) ≈ 1.861. Illustrations are also included to support the underlying theory.
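As background for the derivative-free schemes collected here, a minimal sketch of the classical second-order Steffensen iteration (illustrative only, not the authors' eighth-order three-step scheme) shows how the derivative in Newton's method is replaced by a difference quotient:

```python
def steffensen(f, x0, tol=1e-12, max_iter=50):
    """Classical Steffensen iteration: derivative-free, convergence order 2.

    Replaces f'(x) in Newton's method by the difference quotient
    (f(x + f(x)) - f(x)) / f(x), so each step uses two f-evaluations.
    """
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        denom = f(x + fx) - fx  # approximates f'(x) * f(x)
        if denom == 0:
            raise ZeroDivisionError("flat difference quotient")
        x = x - fx * fx / denom
    return x

# Example: simple zero of f(x) = x**2 - 2 near x0 = 1.5
root = steffensen(lambda x: x**2 - 2, 1.5)
```

The higher-order schemes in these abstracts compose several such derivative-free steps per iteration to reach order eight from four evaluations.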

2014 ◽  
Vol 2014 ◽  
pp. 1-9 ◽  
Author(s):  
T. Lotfi ◽  
F. Soleymani ◽  
Z. Noori ◽  
A. Kılıçman ◽  
F. Khaksar Haghani

Two families of derivative-free methods without memory for approximating a simple zero of a nonlinear equation are presented. The proposed schemes have an accelerator parameter that can increase the convergence rate without any new function evaluations. In this way, we construct a method with memory that considerably increases the efficiency index, from 8^(1/4) ≈ 1.681 to 12^(1/4) ≈ 1.861. Numerical examples and comparisons with existing methods are included to confirm the theoretical results and high computational efficiency.


2018 ◽  
Vol 2018 ◽  
pp. 1-12 ◽  
Author(s):  
Alicia Cordero ◽  
Moin-ud-Din Junjua ◽  
Juan R. Torregrosa ◽  
Nusrat Yasmin ◽  
Fiza Zafar

We construct a family of derivative-free optimal iterative methods without memory to approximate a simple zero of a nonlinear function. Error analysis demonstrates that the without-memory class has eighth-order convergence and is extendable to a with-memory class. The extension of the new family to a with-memory one is also presented; it attains convergence order 15.5156 and a very high efficiency index 15.5156^(1/4) ≈ 1.9847. Some particular schemes of the with-memory family are also described. Numerical examples and some dynamical aspects of the new schemes are given to support the theoretical results.


2011 ◽  
Vol 5 (2) ◽  
pp. 298-317 ◽  
Author(s):  
Miodrag Petković ◽  
Jovana Džunić ◽  
Ljiljana Petković

An efficient family of two-point derivative-free methods with memory for solving nonlinear equations is presented. It is proved that the convergence order of the proposed family is increased from 4 to at least 2 + √6 ≈ 4.45, 5, (5 + √33)/2 ≈ 5.37, and 6, depending on the accelerating technique. The increase of convergence order is attained by a suitable accelerating technique that varies a free parameter in each iteration. The improvement of the convergence rate is achieved without any additional function evaluations, meaning that the proposed methods with memory are very efficient. Moreover, the presented methods are more efficient than all existing methods known in the literature in the class of two-point methods and three-point methods of optimal order eight. Numerical examples and comparisons with existing two-point methods are included to confirm the theoretical results and high computational efficiency. 2010 Mathematics Subject Classification: 65H05.
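The efficiency indices behind these order claims can be checked directly. Assuming three function evaluations per iteration (standard for two-point derivative-free schemes of optimal order four), the efficiency index of an order-p method is p^(1/3):

```python
import math

# Orders attainable by the two-point family with memory, per the abstract;
# with three function evaluations per step, efficiency index = p**(1/3).
orders = [4.0, 2 + math.sqrt(6), 5.0, (5 + math.sqrt(33)) / 2, 6.0]
for p in orders:
    print(f"order {p:.4f} -> efficiency index {p ** (1 / 3):.4f}")
```

Raising the order from 4 to 6 at the same cost lifts the index from 4^(1/3) ≈ 1.587 to 6^(1/3) ≈ 1.817, which is the sense in which these with-memory methods beat optimal eighth-order three-point schemes (8^(1/4) ≈ 1.682).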


2014 ◽  
Vol 2014 ◽  
pp. 1-5 ◽  
Author(s):  
Tahereh Eftekhari

Based on the iterative method proposed by Basto et al. (2006), we present a new derivative-free iterative method for solving nonlinear equations. The aim of this paper is to develop a new method to approximate the root α of the nonlinear equation f(x) = 0. The method has efficiency index 6^(1/4) ≈ 1.5651. The benefit of this method is that it does not require the calculation of any derivative. Several examples illustrate that the efficiency of the new method is better than that of previous methods.


2012 ◽  
Vol 2012 ◽  
pp. 1-12 ◽  
Author(s):  
Rajinder Thukral

A new family of eighth-order derivative-free methods for solving nonlinear equations is presented. It is proved that these methods have convergence order eight. The new methods are derivative-free and use only four evaluations of the function per iteration. In fact, we have obtained the optimal order of convergence, which supports the Kung and Traub conjecture: Kung and Traub conjectured that multipoint iteration methods without memory based on n evaluations can achieve optimal convergence order at most 2^(n-1). Thus, we present new derivative-free methods which agree with the Kung and Traub conjecture for n = 4. Numerical comparisons are made to demonstrate the performance of the presented methods.
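The figures quoted throughout these abstracts follow from the Kung–Traub bound: a multipoint method without memory using n function evaluations per iteration has order at most 2^(n-1), and an order-p method using n evaluations has efficiency index p^(1/n). A short check:

```python
# Kung-Traub bound: n function evaluations per iteration, no memory,
# gives convergence order at most p = 2**(n - 1); efficiency index p**(1/n).
for n in range(2, 5):
    p = 2 ** (n - 1)
    print(f"n = {n}: optimal order {p}, efficiency index {p ** (1 / n):.4f}")
# n = 4 gives order 8 and index 8**(1/4) = 1.6818, matching the abstract.
```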


Mathematics ◽  
2020 ◽  
Vol 8 (1) ◽  
pp. 108 ◽  
Author(s):  
Xiaofeng Wang ◽  
Yuxi Tao

A new Newton method with memory is proposed by using a variable self-accelerating parameter. First, a modified Newton method without memory with an invariant parameter is constructed for solving nonlinear equations. Substituting the invariant parameter of the Newton method without memory by a variable self-accelerating parameter, we obtain a novel Newton method with memory. The convergence order of the new Newton method with memory is 1 + √2. The acceleration of the convergence rate is attained without any additional function evaluations. The main innovation is that the self-accelerating parameter is constructed in a simple way. Numerical experiments show that the presented method has a faster convergence speed than existing methods.
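A sketch of this self-acceleration idea, using Traub's classical Steffensen-type variant with memory rather than the authors' Newton-type scheme: the free parameter is re-estimated each step from already-stored values, which raises the R-order from 2 to 1 + √2 ≈ 2.414 at no extra cost.

```python
def traub_with_memory(f, x0, gamma0=-0.01, tol=1e-12, max_iter=50):
    """Steffensen-type iteration with memory (Traub's classical variant).

    Two f-evaluations per step. The free parameter gamma is re-estimated
    from stored values, lifting the R-order from 2 to 1 + sqrt(2) ~ 2.414
    without any additional function evaluations.
    """
    x, gamma = x0, gamma0
    fx = f(x)
    for _ in range(max_iter):
        if abs(fx) < tol:
            break
        w = x + gamma * fx             # auxiliary point
        slope = (f(w) - fx) / (w - x)  # divided difference f[x, w]
        x_new = x - fx / slope
        fx_new = f(x_new)
        # self-accelerating update gamma = -1 / (secant slope), built
        # purely from stored values -- no new evaluations of f
        gamma = -(x_new - x) / (fx_new - fx)
        x, fx = x_new, fx_new
    return x

# Example: simple zero of f(x) = x**2 - 2 starting from x0 = 1.5
root = traub_with_memory(lambda x: x**2 - 2, 1.5)
```

The parameter names and the secant-based update rule here are one standard choice from the with-memory literature, not the specific formula of this paper.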


2015 ◽  
Vol 2015 ◽  
pp. 1-7
Author(s):  
J. P. Jaiswal

The present paper is devoted to improving the R-order of convergence of the with-memory derivative-free methods presented by Lotfi et al. (2014) without any new evaluation. To achieve this aim, one more self-accelerating parameter is inserted, which is calculated with the help of Newton's interpolatory polynomial. First, it is proved theoretically that the R-order of convergence of the proposed schemes is increased from 6 to 7 and from 12 to 14, respectively, without adding any extra evaluation. Smooth as well as nonsmooth examples are discussed to confirm the theoretical results and the superiority of the proposed schemes.
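Self-accelerating parameters of this kind are commonly computed as γ = -1/N'_m(x_k), where N_m is the Newton interpolating polynomial through recent iterates. A sketch of that derivative evaluation via divided differences (the node choice and update rule here are illustrative, not the paper's exact formula):

```python
def newton_poly_derivative(xs, ys, t):
    """Derivative at t of the Newton interpolating polynomial through
    the nodes (xs[i], ys[i]), computed from divided differences."""
    n = len(xs)
    # divided-difference coefficients c[i] = f[x0, ..., xi]
    c = list(ys)
    for j in range(1, n):
        for i in range(n - 1, j - 1, -1):
            c[i] = (c[i] - c[i - 1]) / (xs[i] - xs[i - j])
    # differentiate N(t) = sum_i c[i] * prod_{k<i}(t - xs[k]) by the
    # product rule
    deriv = 0.0
    for i in range(1, n):
        s = 0.0
        for j in range(i):
            prod = 1.0
            for k in range(i):
                if k != j:
                    prod *= t - xs[k]
            s += prod
        deriv += c[i] * s
    return deriv

# Example: nodes sampled from f(x) = x**2; three nodes reproduce the
# quadratic exactly, so the derivative at t = 1.5 is 2 * 1.5 = 3
d = newton_poly_derivative([0.0, 1.0, 2.0], [0.0, 1.0, 4.0], 1.5)
gamma = -1.0 / d  # self-accelerating parameter update
```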


2011 ◽  
Vol 2011 ◽  
pp. 1-12 ◽  
Author(s):  
R. Thukral

A new family of eighth-order derivative-free methods for solving nonlinear equations is presented. It is proved that these methods have convergence order eight. The new methods are derivative-free and use only four evaluations of the function per iteration. In fact, we have obtained the optimal order of convergence, which supports the Kung and Traub conjecture: Kung and Traub conjectured that multipoint iteration methods without memory based on n evaluations can achieve optimal convergence order at most 2^(n-1). Thus, we present new derivative-free methods which agree with the Kung and Traub conjecture for n = 4. Numerical comparisons are made to demonstrate the performance of the presented methods.


2014 ◽  
Vol 2014 ◽  
pp. 1-6 ◽  
Author(s):  
J. P. Jaiswal

Two derivative-free Steffensen-type methods with memory for solving nonlinear equations are presented. By making use of a suitable self-accelerating parameter in existing optimal fourth- and eighth-order without-memory methods, the order of convergence is increased without any extra function evaluation. Therefore, the efficiency index is also increased, which is the main contribution of this paper. The self-accelerating parameters are estimated using Newton's interpolation. To show the applicability of the proposed methods, some numerical illustrations are presented.

