Linear Convergence of Proximal Gradient Algorithm with Extrapolation for a Class of Nonconvex Nonsmooth Minimization Problems

2017 · Vol 27 (1) · pp. 124-145
Author(s): Bo Wen, Xiaojun Chen, Ting Kei Pong
IEEE Access · 2019 · Vol 7 · pp. 126515-126529
Author(s): Xiaoya Zhang, Roberto Barrio, M. Angeles Martinez, Hao Jiang, Lizhi Cheng

Author(s): Yi Zhou, Zhe Wang, Kaiyi Ji, Yingbin Liang, Vahid Tarokh

Various parameter restart schemes have been proposed for the proximal gradient algorithm with momentum to facilitate its convergence in convex optimization. Under parameter restart, however, the convergence of the proximal gradient algorithm with momentum remains unclear in nonconvex optimization. In this paper, we propose a novel proximal gradient algorithm with momentum and parameter restart for solving nonconvex and nonsmooth problems. Our algorithm is designed to 1) allow flexible parameter restart schemes that cover many existing ones; 2) achieve a global sub-linear convergence rate in nonconvex and nonsmooth optimization; and 3) guarantee convergence to a critical point, with various asymptotic convergence rates depending on the parameterization of the local geometry in nonconvex and nonsmooth optimization. Numerical experiments demonstrate the convergence and effectiveness of the proposed algorithm.
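For readers unfamiliar with the momentum-plus-restart template this abstract describes, the following is a minimal Python sketch of an accelerated proximal gradient loop for a composite problem min f(x) + g(x), with f smooth and g prox-friendly. The function names, the (k-1)/(k+2) momentum schedule, and the gradient-style restart test are generic illustrations in the spirit of the abstract, not the paper's specific scheme.

```python
import numpy as np

def prox_l1(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def apg_restart(grad_f, prox_g, x0, step, n_iter=500):
    """Proximal gradient with momentum and a restart test (generic sketch)."""
    x_prev = x = x0.copy()
    k = 1
    for _ in range(n_iter):
        beta = (k - 1.0) / (k + 2.0)       # momentum (extrapolation) weight
        y = x + beta * (x - x_prev)        # extrapolated point
        x_next = prox_g(y - step * grad_f(y), step)
        # restart: reset momentum when the update opposes the trajectory
        if np.dot(y - x_next, x_next - x) > 0.0:
            k = 1
        else:
            k += 1
        x_prev, x = x, x_next
    return x

# usage on a LASSO-type problem: min 0.5*||A x - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((40, 100)), rng.standard_normal(40), 0.1
L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
x_hat = apg_restart(lambda x: A.T @ (A @ x - b),
                    lambda v, t: prox_l1(v, lam * t),
                    np.zeros(100), 1.0 / L)
```

The restart condition shown here resets the momentum counter whenever the proximal update moves against the recent trajectory, one of several function-value and gradient-based tests the flexible-restart framework is meant to cover.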


2019 · Vol 35 (3) · pp. 371-378
Author(s): Porntip Promsinchai, Narin Petrot

In this paper, we consider convex constrained optimization problems with composite objective functions over the set of minimizers of another function. The main aim is to numerically test a new algorithm, a stochastic block coordinate proximal-gradient algorithm with penalization, by comparing both the number of iterations and the CPU time of this algorithm against other well-known block coordinate descent algorithms on randomly generated optimization problems with a regularization term.
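As a rough illustration of the algorithmic template named above, here is a minimal Python sketch under a standard penalty-scheme reading: f is the smooth part of the composite objective, g the nonsmooth regularizer with proximal map prox_g, and h the function whose minimizers define the constraint set. One uniformly sampled block is updated per iteration; all names and the step/penalty schedules are hypothetical stand-ins, not the authors' exact choices.

```python
import numpy as np

def sbcpg_penalty(grad_f, grad_h, prox_g, x0, n_blocks,
                  step0=0.5, beta0=1.0, n_iter=2000, seed=0):
    """Stochastic block coordinate proximal-gradient step on f + beta_k*h;
    beta_k grows while the step shrinks so step_k * beta_k stays bounded
    (a common condition in penalty schemes of this type)."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    blocks = np.array_split(np.arange(x.size), n_blocks)
    for k in range(1, n_iter + 1):
        step = step0 / k                       # diminishing step size
        beta = beta0 * np.sqrt(k)              # increasing penalty parameter
        idx = blocks[rng.integers(n_blocks)]   # sample one block uniformly
        d = grad_f(x)[idx] + beta * grad_h(x)[idx]
        x[idx] = prox_g(x[idx] - step * d, step)
    return x
```

Updating a single block per iteration keeps the per-step cost low, which is what the paper's CPU-time comparison against full block coordinate descent methods is designed to probe.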


2012 · Vol 60 (3) · pp. 481-489
Author(s): J.M. Łęski, N. Henzel

Abstract: Linear regression analysis has become a fundamental tool in the experimental sciences. We propose a new method for parameter estimation in linear models. The 'Generalized Ordered Linear Regression with Regularization' (GOLRR) uses various loss functions (including the ε-insensitive ones), ordered weighted averaging of the residuals, and regularization. The algorithm consists of solving a sequence of weighted quadratic minimization problems, where the weights used for the next iteration depend not only on the values but also on the order of the model residuals obtained in the current iteration. Such a regression problem can be transformed into the iteratively reweighted least squares scenario. The conjugate gradient algorithm is used to minimize the proposed criterion function. Finally, numerical examples are given to demonstrate the validity of the proposed method.
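The iteratively reweighted least-squares reading of this procedure can be sketched as follows in Python. The rank-based weight assignment, the ε-insensitive zone, and the ridge-style regularizer are plausible stand-ins consistent with the abstract, not the paper's exact criterion; SciPy's conjugate gradient solver handles each weighted subproblem.

```python
import numpy as np
from scipy.sparse.linalg import cg

def golrr(X, y, ow, eps=0.1, alpha=1.0, n_iter=20):
    """Iteratively reweighted least squares with rank-ordered residual
    weights (ow, length n), an epsilon-insensitive zone, and ridge
    regularization; each weighted subproblem is solved by CG."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        r = y - X @ w
        ranks = np.argsort(np.argsort(np.abs(r)))   # 0 = smallest |residual|
        v = ow[ranks].astype(float)                 # weight by residual rank
        v[np.abs(r) <= eps] = 0.0                   # epsilon-insensitive zone
        # regularized weighted least squares: (X' V X + alpha I) w = X' V y
        A = X.T @ (v[:, None] * X) + alpha * np.eye(d)
        w, _ = cg(A, X.T @ (v * y), x0=w)
    return w
```

Re-sorting the residuals at every pass is what makes the weighting "ordered": the same weight vector ow is reassigned to observations by rank, so outlying residuals can be progressively down-weighted regardless of which observations produce them.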

