Stochastic quasi-Newton with line-search regularisation

Automatica ◽  
2021 ◽  
Vol 127 ◽  
pp. 109503
Author(s):  
Adrian G. Wills ◽  
Thomas B. Schön


Author(s):  
Jie Guo ◽  
Zhong Wan

A new spectral three-term conjugate gradient algorithm based on the quasi-Newton equation is developed for solving large-scale unconstrained optimization problems. It is proved that the search directions in this algorithm always satisfy a sufficient descent condition independent of any line search. Global convergence is established for general objective functions when the strong Wolfe line search is used. Numerical experiments demonstrate its high performance in solving large-scale optimization problems. In particular, the developed algorithm is applied to 100 benchmark test problems from CUTE with dimensions ranging from 1000 to 10,000, in comparison with some similar algorithms from the literature. The numerical results demonstrate that our algorithm outperforms the state-of-the-art ones in terms of CPU time, number of iterations, and number of function evaluations.
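A generic three-term direction of this family can be sketched in a few lines. The sketch below is illustrative, not the paper's spectral formulas: it uses a Hestenes-Stiefel-type beta, chooses the third-term weight gamma so that d_{k+1}^T g_{k+1} = -||g_{k+1}||^2 (so sufficient descent holds independently of the line search, as the abstract claims), and substitutes simple Armijo backtracking for the strong Wolfe search.

```python
import numpy as np

def armijo(f, g, x, d, alpha=1.0, rho=0.5, c1=1e-4):
    """Backtracking line search (Armijo condition only, standing in
    for the strong Wolfe search assumed by the convergence theory)."""
    fx, slope = f(x), g(x) @ d
    while f(x + alpha * d) > fx + c1 * alpha * slope:
        alpha *= rho
    return alpha

def three_term_cg(f, g, x0, tol=1e-6, max_iter=500):
    """Sketch of a three-term CG iteration
        d_{k+1} = -g_{k+1} + beta_k d_k + gamma_k y_k,
    with illustrative parameter choices (not the cited paper's)."""
    x = np.asarray(x0, float)
    gk = g(x)
    d = -gk
    for _ in range(max_iter):
        if np.linalg.norm(gk) < tol:
            break
        a = armijo(f, g, x, d)
        x_new = x + a * d
        g_new = g(x_new)
        y = g_new - gk
        denom = d @ y
        if abs(denom) < 1e-12:
            d = -g_new                    # restart with steepest descent
        else:
            beta = (g_new @ y) / denom    # Hestenes-Stiefel-type parameter
            gamma = -(g_new @ d) / denom  # yields d^T g = -||g||^2 exactly
            d = -g_new + beta * d + gamma * y
        x, gk = x_new, g_new
    return x
```

With this choice of gamma, expanding d_{k+1}^T g_{k+1} shows the beta and gamma terms cancel, leaving exactly -||g_{k+1}||^2 — a line-search-independent descent guarantee of the kind the abstract describes.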


2020 ◽  
Vol 2020 ◽  
pp. 1-15
Author(s):  
Pengyuan Li ◽  
Zhan Wang ◽  
Dan Luo ◽  
Hongtruong Pham

The BFGS method is one of the most efficient quasi-Newton methods for solving small- and medium-size unconstrained optimization problems. To explore more of its interesting properties, a modified two-parameter scaled BFGS method is presented in this paper. The intention of the modified scaled BFGS method is to improve the eigenvalue structure of the BFGS update. In this method, the first two terms and the last term of the standard BFGS update formula are scaled with two different positive parameters, and a new value of yk is given. Meanwhile, the Yuan-Wei-Lu line search is also adopted. Under this line search, the modified two-parameter scaled BFGS method is globally convergent for nonconvex functions. Extensive numerical experiments show that this form of the scaled BFGS method outperforms the standard BFGS method and some similar scaled methods.
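A generic form of such a two-parameter scaling can be sketched as follows. The scaling parameters delta1 and delta2 are left as free inputs (delta1 = delta2 = 1 recovers the standard BFGS update); the paper's specific parameter rules and modified yk are not reproduced here.

```python
import numpy as np

def scaled_bfgs_update(B, s, y, delta1=1.0, delta2=1.0):
    """Two-parameter scaled BFGS update (generic form):
        B+ = delta1 * (B - B s s^T B / s^T B s) + delta2 * y y^T / y^T s.
    The first two terms of the standard formula are scaled by delta1 and
    the last term by delta2, which shifts the eigenvalue structure of B+."""
    Bs = B @ s
    sBs = s @ Bs
    ys = y @ s
    if sBs <= 0 or ys <= 0:   # curvature safeguard: skip the update
        return B
    return delta1 * (B - np.outer(Bs, Bs) / sBs) + delta2 * np.outer(y, y) / ys
```

At delta1 = delta2 = 1 the update satisfies the secant equation B+ s = y, since the B s terms cancel and the last term maps s to y.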


2019 ◽  
Vol 2019 ◽  
pp. 1-6
Author(s):  
Eman T. Hamed ◽  
Huda I. Ahmed ◽  
Abbas Y. Al-Bayati

In this study, we propose a new hybrid algorithm that combines the Steepest Descent (SD) and Quasi-Newton (QN) search directions. First, we develop a new search direction that combines the conjugate gradient (CG) and QN strategies. Second, we describe a new positive CG method that possesses the sufficient descent property under the strong Wolfe line search. We also prove a new theorem ensuring the global convergence property under some given conditions. Our numerical results show that the new algorithm is robust compared with other standard large-scale CG methods.
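The basic idea of mixing the SD and QN directions can be illustrated with a convex combination; the specific mixing rule of the algorithm above is not reproduced, and theta is an illustrative free parameter.

```python
import numpy as np

def hybrid_direction(g, H, theta=0.5):
    """Convex combination of the steepest-descent direction -g and the
    quasi-Newton direction -H g (H an inverse-Hessian approximation).
    For any theta in [0, 1] and positive definite H, the combination of
    two descent directions remains a descent direction."""
    return -(theta * g + (1.0 - theta) * H @ g)
```

Because d^T g = -(theta ||g||^2 + (1-theta) g^T H g) and both terms are positive for nonzero g, the descent property holds for every admissible theta.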


Author(s):  
Andreas Griewank

Iterative methods for solving a square system of nonlinear equations g(x) = 0 often require that the sum of squares residual γ(x) ≡ ½∥g(x)∥² be reduced at each step. Since the gradient of γ depends on the Jacobian ∇g, this stabilization strategy is not easily implemented if only approximations Bk to ∇g are available. Therefore most quasi-Newton algorithms either include special updating steps or reset Bk to a divided difference estimate of ∇g whenever no satisfactory progress is made. Here the need for such back-up devices is avoided by a derivative-free line search in the range of g. Assuming that the Bk are generated from an arbitrary B0 by fixed scale updates, we establish superlinear convergence from within any compact level set of γ on which g has a differentiable inverse function g−1.
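The flavour of a derivative-free line search on the residual can be sketched with Broyden's method: a step is accepted when the norm of g itself decreases sufficiently, so no Jacobian and no gradient of γ is ever formed. This is a simplified stand-in for the range-space search analysed in the paper, not its exact scheme.

```python
import numpy as np

def broyden_solve(g, x0, tol=1e-8, max_iter=100):
    """Broyden's (good) method with a derivative-free backtracking test
    on the residual norm: accept step length t when
        ||g(x + t d)|| <= (1 - sigma t) ||g(x)||."""
    x = np.asarray(x0, float)
    gx = g(x)
    B = np.eye(len(x))                 # initial Jacobian approximation
    sigma = 1e-4
    for _ in range(max_iter):
        if np.linalg.norm(gx) < tol:
            break
        d = np.linalg.solve(B, -gx)    # quasi-Newton step
        t = 1.0
        while np.linalg.norm(g(x + t * d)) > (1 - sigma * t) * np.linalg.norm(gx):
            t *= 0.5
            if t < 1e-12:
                break
        s = t * d
        x_new = x + s
        g_new = g(x_new)
        # rank-one Broyden update enforcing the secant condition B+ s = g_new - gx
        B = B + np.outer(g_new - gx - B @ s, s) / (s @ s)
        x, gx = x_new, g_new
    return x
```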


Author(s):  
Mayank Pareek ◽  
Rupal Vikas Srivastava ◽  
Sara Behdad

Building insulation is considered a solution to reduce energy costs for both residential and commercial buildings. However, determining the combination of insulation materials that results in the lowest total ownership cost is becoming a greater challenge. Various factors influence the efficiency of heat transfer within a room, including the geometry and size of the room, the ambient temperature, heat sources and sinks present inside the building, the type of insulation materials, etc. The aim of this paper is to develop an optimization-based decision-making tool to help homeowners select the best combination of given insulation materials considering all these factors. The design approach adopted in this paper minimizes total ownership cost while providing the required heating in the building. An SQP quasi-Newton line-search algorithm was used to obtain optimized thermal conductivity values for the combination of insulation materials to be used in the walls, floor, ceiling, window, and door of a room, along with the width of the air gap to be kept. The results help in deciding which combination of insulation materials will achieve the required heating for the homeowner while keeping the total incurred cost to a minimum.
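A toy single-wall version of such a formulation can be posed with SciPy's SLSQP solver, an SQP quasi-Newton line-search method of the kind described above. All numbers below (wall area, temperature difference, prices, the heat-loss cap, and the material cost model) are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative single-wall model (assumed values throughout).
A, dT, L = 10.0, 20.0, 0.1            # wall area (m^2), temp difference (K), thickness (m)
energy_price = 0.5                     # assumed cost per watt of steady heat loss
material_cost = lambda k: 50.0 / k     # assumed: better insulators (low k) cost more

def total_cost(v):
    k = v[0]                           # thermal conductivity (W/m/K), the design variable
    heat_loss = k * A * dT / L         # Fourier's law: Q = k A dT / L
    return material_cost(k) + energy_price * heat_loss

res = minimize(total_cost, x0=[0.1], method="SLSQP",
               bounds=[(0.02, 1.0)],
               constraints=[{"type": "ineq",
                             "fun": lambda v: 400.0 - v[0] * A * dT / L}])  # cap Q at 400 W
```

For these numbers the unconstrained cost minimum (k ≈ 0.224) would violate the 400 W heat-loss cap, so the solver lands on the active constraint at k = 400/(A·dT/L) = 0.2.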


2020 ◽  
Vol 2020 ◽  
pp. 1-13
Author(s):  
Zhensheng Yu ◽  
Zilun Wang ◽  
Ke Su

In this paper, a double nonmonotone quasi-Newton method is proposed for the nonlinear complementarity problem. By using 3-1 piecewise and 4-1 piecewise nonlinear complementarity functions, the nonlinear complementarity problem is reformulated into a smooth equation. With a double nonmonotone line search, a smooth Broyden-like algorithm is proposed, in which only a single smooth equation needs to be solved at each iteration, reducing the scale of the computation. Under suitable conditions, the global convergence of the algorithm is proved, and numerical results with some practical applications are given to show the efficiency of the algorithm.
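The reformulation idea rests on NCP functions φ with φ(a, b) = 0 exactly when a ≥ 0, b ≥ 0, and ab = 0. The paper uses 3-1 and 4-1 piecewise NCP functions; the sketch below shows the best-known member of the same family, the Fischer-Burmeister function, as a stand-in.

```python
import numpy as np

def fischer_burmeister(a, b):
    """Fischer-Burmeister NCP function: phi(a, b) = sqrt(a^2 + b^2) - a - b.
    phi(a, b) = 0 iff a >= 0, b >= 0, and a*b = 0, so the complementarity
    problem 0 <= x, 0 <= F(x), x^T F(x) = 0 becomes the system of
    equations phi(x_i, F_i(x)) = 0 for every component i."""
    return np.sqrt(a ** 2 + b ** 2) - a - b
```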


Symmetry ◽  
2020 ◽  
Vol 12 (4) ◽  
pp. 656
Author(s):  
Quan Qu ◽  
Xianfeng Ding ◽  
Xinyi Wang

In this paper, a new nonmonotone adaptive trust region algorithm is proposed for unconstrained optimization by combining a multidimensional filter and the Goldstein-type line search technique. A modified trust region ratio is presented which results in more reasonable consistency between the accurate model and the approximate model. When a trial step is rejected, we use a multidimensional filter to increase the likelihood that the trial step is accepted. If the trial step is still not successful with the filter, a nonmonotone Goldstein-type line search is used in the direction of the rejected trial step. The approximation of the Hessian matrix is updated by the modified Quasi-Newton formula (CBFGS). Under appropriate conditions, the proposed algorithm is globally convergent and superlinearly convergent. The new algorithm shows better performance in terms of the Dolan–Moré performance profile. Numerical results demonstrate the efficiency and robustness of the proposed algorithm for solving unconstrained optimization problems.
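The nonmonotone Goldstein-type acceptance can be sketched in isolation: sufficient decrease is measured against the maximum of the last few function values rather than the current one, so an occasional increase is tolerated. This is a minimal stand-in; the paper couples such a test with a multidimensional filter and a trust region, which are not reproduced here.

```python
def nonmonotone_goldstein(f, x, d, slope, history, c=1e-4, rho=0.5):
    """Nonmonotone backtracking sketch: the sufficient-decrease test
    compares against max(history), the largest of the recent function
    values, instead of f(x).  `slope` is the directional derivative
    of f at x along d (negative for a descent direction)."""
    f_ref = max(history)       # nonmonotone reference value
    t = 1.0
    while f(x + t * d) > f_ref + c * t * slope:
        t *= rho
        if t < 1e-12:
            break
    return t
```

With history = [4.0, 10.0] a step that raises f from 4.0 to 6.25 is still accepted at full length, whereas a monotone Armijo test with reference f(x) = 4.0 would reject it.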


2013 ◽  
Vol 2013 ◽  
pp. 1-9
Author(s):  
Zhuqing Gui ◽  
Chunyan Hu ◽  
Zhibin Zhu

Firstly, we give the Karush-Kuhn-Tucker (KKT) optimality condition of the primal problem and briefly introduce Jordan algebras. On the basis of Jordan algebras, we extend the smoothing Fischer-Burmeister (F-B) function to Jordan algebras and smooth the complementarity condition, so that the first-order optimality condition can be reformulated as a nonlinear system. Secondly, we use a mixed line search quasi-Newton method to solve this nonlinear system. Finally, we prove the global convergence and local superlinear convergence of the algorithm.
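In the scalar case, the smoothing Fischer-Burmeister construction replaces the nonsmooth F-B function with a perturbed version that is differentiable everywhere for μ > 0. The sketch below shows this scalar case only; the paper's extension to Jordan algebras is not reproduced.

```python
import numpy as np

def smoothed_fb(a, b, mu):
    """Smoothing Fischer-Burmeister function
        phi_mu(a, b) = sqrt(a^2 + b^2 + 2*mu^2) - a - b.
    The plain F-B function (mu = 0) is nonsmooth at a = b = 0; for
    mu > 0 the square root argument is bounded away from zero, so
    phi_mu is differentiable everywhere, and phi_mu -> phi as mu -> 0."""
    return np.sqrt(a ** 2 + b ** 2 + 2 * mu ** 2) - a - b
```

Driving μ to zero along the quasi-Newton iteration recovers the exact complementarity condition in the limit.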


2021 ◽  
Vol 2 (3) ◽  
pp. 1-17
Author(s):  
Jacques SABITI KISETA ◽  
Roger LIENDI AKUMOSO

The conditional, unconditional, or exact maximum likelihood estimation and the least-squares estimation involve minimizing either the conditional or the unconditional residual sum of squares. The maximum likelihood estimation (MLE) approach and the nonlinear least squares (NLS) procedure involve an iterative search technique for obtaining global rather than local optimal estimates. Several authors have presented brief overviews of algorithms for solving NLS problems. Snezana S. Djordjevic (2019) presented a review of some unconstrained optimization methods based on line search techniques. Mahaboob et al. (2017) proposed a different approach to estimating nonlinear regression models using numerical methods, also based on line search techniques. Mohammad, Waziri, and Santos (2019) briefly reviewed methods for solving NLS problems, paying special attention to structured quasi-Newton methods, which belong to the family of line search techniques. Ya-Xiang Yuan (2011) reviewed some recent results on numerical methods for nonlinear equations and NLS problems based on line searches and trust region techniques, particularly Levenberg-Marquardt type methods, quasi-Newton type methods, and trust region algorithms. The purpose of this paper is to review some of the more well-known and robust line search and trust region numerical optimization algorithms most used in practice for the estimation of time series models and other nonlinear regression models. The line search algorithms considered are: the Gradient algorithm, the Steepest Descent (SD) algorithm, the Newton-Raphson (NR) algorithm, Murray's algorithm, the Quasi-Newton (QN) algorithm, the Gauss-Newton (GN) algorithm, the Fletcher and Powell (FP) algorithm, and the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm. The only trust region algorithm considered is the Levenberg-Marquardt (LM) algorithm. We also give some main advantages and disadvantages of these different algorithms.
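As a concrete reference point for the quasi-Newton entries in this list, a minimal BFGS iteration with Armijo backtracking can be sketched as follows (standard textbook formulas, not tied to any one of the surveyed papers).

```python
import numpy as np

def bfgs_minimize(f, grad, x0, tol=1e-8, max_iter=200):
    """Minimal BFGS with an Armijo backtracking line search."""
    x = np.asarray(x0, float)
    n = len(x)
    H = np.eye(n)                      # inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g
        t, fx, slope = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * slope:   # Armijo condition
            t *= 0.5
        s = t * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        ys = y @ s
        if ys > 1e-12:                 # curvature check keeps H positive definite
            rho = 1.0 / ys
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)    # BFGS inverse update
        x, g = x_new, g_new
    return x
```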

