A Smoothing Proximal Gradient Algorithm for Nonsmooth Convex Regression with Cardinality Penalty

2020 ◽  
Vol 58 (1) ◽  
pp. 858-883 ◽  
Author(s):  
Wei Bian ◽  
Xiaojun Chen
2019 ◽  
Vol 35 (3) ◽  
pp. 371-378
Author(s):  
PORNTIP PROMSINCHAI ◽  
NARIN PETROT ◽  

In this paper, we consider convex constrained optimization problems with composite objective functions over the set of minimizers of another function. The main aim is to numerically test a new algorithm, a stochastic block coordinate proximal-gradient algorithm with penalization, by comparing both the number of iterations and the CPU times of the proposed algorithm against well-known block coordinate descent algorithms on randomly generated optimization problems with a regularization term.
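The abstract above does not give the algorithm's details, so the following is only a minimal sketch of a generic stochastic block coordinate proximal-gradient step, illustrated on an assumed l1-regularized least-squares test problem (the paper's penalized formulation over a minimizer set is not reproduced here; all function names are hypothetical):

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def stochastic_bcd_prox(A, b, lam, n_blocks=2, step=None, iters=500, seed=0):
    # Sketch: min_x 0.5*||Ax - b||^2 + lam*||x||_1, updating one randomly
    # sampled coordinate block per iteration with a prox-gradient step.
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    blocks = np.array_split(np.arange(n), min(n_blocks, n))
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L for the smooth term
    for _ in range(iters):
        blk = blocks[rng.integers(len(blocks))]   # sample one block
        grad_blk = A[:, blk].T @ (A @ x - b)      # partial gradient
        x[blk] = soft_threshold(x[blk] - step * grad_blk, step * lam)
    return x
```

A comparison like the one the abstract describes would run this sampler and a cyclic block coordinate descent on the same random instances, recording iteration counts and CPU times.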


2013 ◽  
Vol 2013 ◽  
pp. 1-8 ◽  
Author(s):  
Maryam A. Alghamdi ◽  
Mohammad Ali Alghamdi ◽  
Naseer Shahzad ◽  
Hong-Kun Xu

We introduce the Q-lasso, which generalizes the well-known lasso of Tibshirani (1996), with Q a closed convex subset of a Euclidean m-space for some integer m ≥ 1. The set Q can be interpreted as the set of errors within a given tolerance level when linear measurements are taken to recover a signal/image via the lasso. Solutions of the Q-lasso depend on a tuning parameter γ. In this paper, we obtain basic properties of the solutions as a function of γ. Because of ill-posedness, we also apply l1-l2 regularization to the Q-lasso. In addition, we discuss iterative methods for solving the Q-lasso, which include the proximal-gradient algorithm and the projection-gradient algorithm.
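One of the iterative methods mentioned, the proximal-gradient algorithm, can be sketched as follows. The objective form 0.5*||(I - P_Q)(Ax)||^2 + γ*||x||_1, with P_Q the projection onto Q, is an assumed reading of the abstract, and the function names below are hypothetical:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def q_lasso_prox_grad(A, proj_Q, gamma, step, iters=1000):
    # Sketch of proximal-gradient for an assumed Q-lasso objective:
    #   min_x 0.5*||(I - P_Q)(Ax)||^2 + gamma*||x||_1,
    # where proj_Q computes the projection onto the closed convex set Q.
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(iters):
        r = A @ x
        grad = A.T @ (r - proj_Q(r))  # gradient of the smooth term
        x = soft_threshold(x - step * grad, step * gamma)
    return x
```

With Q taken as a componentwise tolerance box {y : |y - b| ≤ ε}, as the "errors within given tolerance" interpretation suggests, proj_Q is simply `lambda y: np.clip(y, b - eps, b + eps)`, and ε = 0 recovers the ordinary lasso.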


2019 ◽  
Vol 82 (3) ◽  
pp. 891-917 ◽  
Author(s):  
Lorenzo Rosasco ◽  
Silvia Villa ◽  
Bằng Công Vũ
