A nonmonotone line search method for stochastic optimization problems

Filomat ◽  
2018 ◽  
Vol 32 (19) ◽  
pp. 6799-6807
Author(s):  
Natasa Krejic ◽  
Sanja Loncar

A nonmonotone line search method for solving unconstrained optimization problems whose objective function takes the form of a mathematical expectation is proposed and analyzed. The method works with approximate objective values obtained from increasing sample sizes, improving accuracy gradually. The nonmonotone rule significantly enlarges the set of admissible search directions and prevents unnecessarily small steps at the beginning of the iterative procedure. Convergence is shown for any search direction that approaches the negative gradient in the limit, with the results obtained in the sense of zero upper density. Initial numerical results confirm the theoretical analysis and demonstrate the efficiency of the proposed approach.
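The scheme described above — sample-average objective values with a growing sample size, combined with a nonmonotone (max-over-recent-values) Armijo rule along the negative gradient — can be sketched as follows. The quadratic test objective E[||x − ξ||²] with ξ ~ N(0, I), and all parameter values, are illustrative assumptions for this sketch, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_obj_grad(x, n):
    """Sample-average approximation of E[||x - xi||^2], xi ~ N(0, I)."""
    xi = rng.normal(size=(n, x.size))
    diffs = x - xi
    return np.mean(np.sum(diffs**2, axis=1)), 2.0 * np.mean(diffs, axis=0)

def nonmonotone_sgd(x, n0=10, growth=1.1, M=5, eta=1e-4, iters=30):
    n = float(n0)
    f, g = sample_obj_grad(x, int(n))
    recent = [f]                          # window for the nonmonotone reference
    for _ in range(iters):
        d = -g                            # negative-gradient direction
        f_ref = max(recent)               # nonmonotone Armijo reference value
        t = 1.0
        for _ in range(30):               # capped backtracking
            f_new, _ = sample_obj_grad(x + t * d, int(n))
            if f_new <= f_ref + eta * t * (g @ d):
                break
            t *= 0.5
        x = x + t * d
        n *= growth                       # larger sample -> better accuracy
        f, g = sample_obj_grad(x, int(n))
        recent = (recent + [f])[-M:]
    return x
```

Because the reference value is the maximum over the last M function values rather than the latest one, early iterations can take full steps even when a single noisy evaluation looks worse, which is exactly the "prevents unnecessarily small steps" effect the abstract mentions.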

Author(s):  
Saman Babaie-Kafaki ◽  
Saeed Rezaee

Hybridizing the trust region, line search, and simulated annealing methods, we develop a heuristic algorithm for solving unconstrained optimization problems. We carry out numerical experiments on a set of CUTEr test problems to investigate the efficiency of the suggested algorithm. The results show that the algorithm is practically promising.
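A minimal sketch of one way such a hybridization can look: random proposals bounded by a trust-region-style radius, Metropolis (simulated annealing) acceptance of uphill moves, and a crude doubling line search to refine downhill steps. The sphere test function, the radius and cooling updates, and all constants are assumptions for the sketch; the paper's actual scheme is more elaborate.

```python
import numpy as np

rng = np.random.default_rng(1)

def hybrid_sa(f, x, radius=1.0, temp=1.0, cooling=0.95, iters=200):
    fx = f(x)
    best_x, best_f = x.copy(), fx
    for _ in range(iters):
        # trust-region-style proposal: random step bounded by the radius
        x_new = x + rng.uniform(-radius, radius, size=x.size)
        f_new = f(x_new)
        if f_new < fx:
            # successful step: extend along it with a doubling line search
            d, t = x_new - x, 1.0
            while f(x + 2.0 * t * d) < f(x + t * d):
                t *= 2.0
            x = x + t * d
            fx = f(x)
            radius = min(radius * 1.2, 10.0)       # expand on success
        elif rng.random() < np.exp(-(f_new - fx) / temp):
            x, fx = x_new, f_new                   # SA uphill acceptance
        else:
            radius = max(radius * 0.7, 1e-8)       # shrink on rejection
        temp *= cooling                            # annealing schedule
        if fx < best_f:
            best_x, best_f = x.copy(), fx
    return best_x, best_f
```

The annealing acceptance lets the iterate escape shallow local minima early on, while the shrinking radius and cooling temperature gradually turn the method into a local descent scheme.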


2015 ◽  
Vol 9 (7) ◽  
pp. 1371-1391 ◽  
Author(s):  
Nataša Krejić ◽  
Zorana Lužanin ◽  
Filip Nikolovski ◽  
Irena Stojkovska

Symmetry ◽  
2020 ◽  
Vol 12 (4) ◽  
pp. 656
Author(s):  
Quan Qu ◽  
Xianfeng Ding ◽  
Xinyi Wang

In this paper, a new nonmonotone adaptive trust region algorithm is proposed for unconstrained optimization by combining a multidimensional filter with a Goldstein-type line search technique. A modified trust region ratio is presented, which yields more reasonable consistency between the accurate model and the approximate model. When a trial step is rejected, a multidimensional filter is employed to increase the likelihood that the trial step is accepted. If the trial step is still unsuccessful with the filter, a nonmonotone Goldstein-type line search is used along the direction of the rejected trial step. The approximation of the Hessian matrix is updated by a modified quasi-Newton formula (CBFGS). Under appropriate conditions, the proposed algorithm is globally and superlinearly convergent. The new algorithm shows better performance in terms of the Dolan–Moré performance profile, and numerical results demonstrate its efficiency and robustness for solving unconstrained optimization problems.
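The Goldstein-type acceptance test with a nonmonotone reference value can be illustrated as follows: a step t is accepted when f(x + t d) lies between two linear bounds built from the reference value f_ref (e.g. the maximum of recent function values) rather than from f(x). The bracketing strategy and the constants here are assumptions for the sketch, not the paper's exact rule.

```python
import numpy as np

def nonmonotone_goldstein(f, x, d, g, f_ref, c=1e-4, t0=1.0, max_iter=50):
    """Find a step t satisfying a nonmonotone Goldstein-type condition:
        f_ref + (1 - c) * t * (g @ d)  <=  f(x + t d)  <=  f_ref + c * t * (g @ d),
    with 0 < c < 1/2 and g @ d < 0 (d is a descent direction).
    Simple bisection-style bracketing; illustrative only."""
    gd = g @ d                               # directional derivative, < 0
    lo, hi, t = 0.0, np.inf, t0
    for _ in range(max_iter):
        ft = f(x + t * d)
        if ft > f_ref + c * t * gd:          # upper bound violated: too long
            hi = t
            t = 0.5 * (lo + hi)
        elif ft < f_ref + (1 - c) * t * gd:  # lower bound violated: too short
            lo = t
            t = 2.0 * t if hi == np.inf else 0.5 * (lo + hi)
        else:
            return t
    return t
```

The lower bound is what distinguishes Goldstein-type rules from Armijo backtracking: it rules out excessively small steps, and replacing f(x) by the nonmonotone f_ref relaxes both bounds when recent iterations were noisy or nonmonotone.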


2013 ◽  
Vol 2013 ◽  
pp. 1-5 ◽  
Author(s):  
Yuan-Yuan Chen ◽  
Shou-Qiang Du

The nonlinear conjugate gradient method is a useful method for unconstrained optimization problems. In this paper, we consider three kinds of nonlinear conjugate gradient methods with Wolfe-type line search for unconstrained optimization problems. Under some mild assumptions, global convergence results for the given methods are established. The numerical results show that nonlinear conjugate gradient methods with Wolfe-type line search are efficient for some unconstrained optimization problems.
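One concrete instance of a nonlinear conjugate gradient method with a Wolfe-type line search is the Fletcher–Reeves scheme sketched below; the bisection line search for the weak Wolfe conditions and the descent-restart safeguard are illustrative choices, not necessarily the three variants analyzed in the paper.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.4, max_iter=50):
    """Bisection search for a step satisfying the weak Wolfe conditions."""
    f0, g0d = f(x), grad(x) @ d
    lo, hi, t = 0.0, np.inf, 1.0
    for _ in range(max_iter):
        if f(x + t * d) > f0 + c1 * t * g0d:     # Armijo fails: shrink
            hi = t
            t = 0.5 * (lo + hi)
        elif grad(x + t * d) @ d < c2 * g0d:     # curvature fails: expand
            lo = t
            t = 2.0 * t if hi == np.inf else 0.5 * (lo + hi)
        else:
            return t
    return t

def cg_fletcher_reeves(f, grad, x, tol=1e-6, max_iter=200):
    """Fletcher-Reeves nonlinear CG; one of several beta formulas
    (PRP, HS, ...) used in methods of this kind."""
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t = wolfe_line_search(f, grad, x, d)
        x = x + t * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        if d @ g_new >= 0:                 # safeguard: restart if not descent
            d = -g_new
        g = g_new
    return x
```

On a strictly convex quadratic this reduces to (inexact) linear conjugate gradients; the curvature condition in the line search is what guarantees the Fletcher–Reeves update produces well-scaled directions.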

