Methods with subgradient locality measures for minimizing nonconvex functions

Author(s):  
Krzysztof C. Kiwiel

2021 ◽  
Vol 5 (3) ◽  
pp. 80
Author(s):  
Hari Mohan Srivastava ◽  
Artion Kashuri ◽  
Pshtiwan Othman Mohammed ◽  
Dumitru Baleanu ◽  
Y. S. Hamed

In this paper, the authors define a new generic class of functions involving a certain modified Fox–Wright function. A useful identity involving fractional integrals and this modified Fox–Wright function with two parameters is also derived. Applying this identity as an auxiliary result, they establish some Hermite–Hadamard-type integral inequalities for the above-mentioned class of functions. Several special cases are derived in detail. Moreover, to demonstrate the efficiency of the main results, an application to error estimation is given as well.
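As background, the results described above generalize the classical Hermite–Hadamard inequality, which for a convex function f on [a, b] reads:

```latex
f\!\left(\frac{a+b}{2}\right)
\;\le\;
\frac{1}{b-a}\int_{a}^{b} f(x)\,dx
\;\le\;
\frac{f(a)+f(b)}{2}.
```

The paper's Hermite–Hadamard-type results replace the ordinary integral with fractional integrals involving the modified Fox–Wright function; the exact statements are given in the paper itself.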


Author(s):  
Jose Carrillo ◽  
Shi Jin ◽  
Lei Li ◽  
Yuhua Zhu

We improve the recently introduced consensus-based optimization (CBO) method proposed in [R. Pinnau, C. Totzeck, O. Tse and S. Martin, Math. Models Methods Appl. Sci., 27(01):183–204, 2017], which is a gradient-free optimization method for general nonconvex functions. We first replace the isotropic geometric Brownian motion by a component-wise one, thus removing the dimensionality dependence of the drift rate and making the method more competitive for high-dimensional optimization problems. Secondly, we utilize random mini-batch ideas to reduce the computational cost of calculating the weighted average toward which the individual particles relax. For its mean-field limit, a nonlinear Fokker–Planck equation, we prove, in both the time-continuous and semi-discrete settings, that the convergence of the method, which is exponential in time, is guaranteed under parameter constraints independent of the dimensionality. We also conduct numerical tests on high-dimensional problems to check the success rate of the method.
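As a rough illustration of the scheme described in this abstract (not the authors' code; all hyperparameter names and values below are illustrative assumptions), a component-wise CBO iteration with random mini-batches can be sketched as:

```python
import numpy as np

def cbo_minimize(f, dim, n_particles=50, batch_size=10, lam=1.0,
                 sigma=0.7, beta=30.0, dt=0.1, steps=300, seed=0):
    # Sketch of component-wise consensus-based optimization with mini-batches.
    # All parameter values here are assumptions for exposition, not the paper's.
    rng = np.random.default_rng(seed)
    X = rng.uniform(-3.0, 3.0, size=(n_particles, dim))
    for _ in range(steps):
        # random mini-batch for the weighted average (reduces cost per step)
        batch = rng.choice(n_particles, size=batch_size, replace=False)
        fvals = np.array([f(x) for x in X[batch]])
        w = np.exp(-beta * (fvals - fvals.min()))   # shift for numerical stability
        xbar = (w[:, None] * X[batch]).sum(axis=0) / w.sum()
        # drift of every particle toward the consensus point xbar
        drift = -lam * (X - xbar) * dt
        # component-wise geometric Brownian motion: independent noise per coordinate
        noise = sigma * (X - xbar) * np.sqrt(dt) * rng.standard_normal(X.shape)
        X = X + drift + noise
    fvals = np.array([f(x) for x in X])
    return X[fvals.argmin()]
```

For example, `cbo_minimize(lambda x: np.sum(x**2), dim=5)` drives the particle ensemble toward the minimizer of the quadratic without using any gradient information.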


2020 ◽  
Vol 2020 ◽  
pp. 1-15
Author(s):  
Pengyuan Li ◽  
Zhan Wang ◽  
Dan Luo ◽  
Hongtruong Pham

The BFGS method is one of the most efficient quasi-Newton methods for solving small- and medium-size unconstrained optimization problems. To explore further interesting properties, a modified two-parameter scaled BFGS method is presented in this paper. The aim of the modified scaled BFGS method is to improve the eigenvalue structure of the BFGS update. In this method, the first two terms and the last term of the standard BFGS update formula are scaled with two different positive parameters, and a new value of y_k is given. Meanwhile, the Yuan–Wei–Lu line search is also employed. Under this line search, the modified two-parameter scaled BFGS method is globally convergent for nonconvex functions. Extensive numerical experiments show that this form of the scaled BFGS method outperforms the standard BFGS method and some similar scaled methods.
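To make the scaling idea concrete, here is a minimal sketch of a two-parameter scaled BFGS matrix update; `theta1` and `theta2` are hypothetical placeholders for the two positive scaling parameters, and the paper's specific parameter formulas and modified y_k are not reproduced here:

```python
import numpy as np

def scaled_bfgs_update(B, s, y, theta1=1.0, theta2=1.0):
    # Illustrative two-parameter scaled BFGS update (an assumption for
    # exposition): the first two terms of the standard formula are scaled
    # by theta1 and the last rank-one term by theta2.
    # theta1 = theta2 = 1 recovers the standard BFGS update.
    Bs = B @ s
    sBs = float(s @ Bs)   # curvature along s; positive when B is SPD
    ys = float(y @ s)     # BFGS requires the curvature condition y^T s > 0
    return (theta1 * (B - np.outer(Bs, Bs) / sBs)
            + theta2 * np.outer(y, y) / ys)
```

With theta1 = theta2 = 1 the result satisfies the secant condition B_{k+1} s = y, which is the property the scaling parameters are chosen to preserve while reshaping the eigenvalues.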


2020 ◽  
Vol 2020 ◽  
pp. 1-8
Author(s):  
Chengli Wang ◽  
Muhammad Shoaib Saleem ◽  
Hamood Ur Rehman ◽  
Muhammad Imran

The purpose of this paper is to introduce the notion of strongly (h, s)-nonconvex functions and to present some basic properties of this class of functions. We present the Schur inequality, the Jensen inequality, the Hermite–Hadamard inequality, and a weighted version of the Hermite–Hadamard inequality.

