The Generalized Lasso Problem and Uniqueness

2019 ◽  
Vol 13 (2) ◽  
pp. 2307-2347
Author(s):  
Alnur Ali ◽  
Ryan J. Tibshirani
2011 ◽  
Vol 39 (3) ◽  
pp. 1335-1371 ◽  
Author(s):  
Ryan J. Tibshirani ◽  
Jonathan Taylor

2018 ◽  
Vol 12 (1) ◽  
pp. 1053-1097 ◽  
Author(s):  
Sangwon Hyun ◽  
Max G’Sell ◽  
Ryan J. Tibshirani

Author(s):  
Aaron Berk ◽  
Yaniv Plan ◽  
Özgür Yilmaz

Abstract The use of the generalized Lasso is a common technique for the recovery of structured high-dimensional signals. There are three common formulations of the generalized Lasso; each program has a governing parameter whose optimal value depends on properties of the data. At this optimal value, compressed sensing theory explains why Lasso programs recover structured high-dimensional signals with minimax order-optimal error. Unfortunately, in practice the optimal choice is generally unknown and must be estimated. Thus, we investigate the stability of each of the three Lasso programs with respect to its governing parameter. Our goal is to aid the practitioner in answering the following question: given real data, which Lasso program should be used? We take a step towards answering this by analysing the case where the measurement matrix is the identity (the so-called proximal denoising setup) and the regularizer is the $\ell _{1}$ norm. For each Lasso program, we specify settings in which that program is provably unstable with respect to its governing parameter. We support our analysis with detailed numerical simulations. For example, there are settings where a 0.1% underestimate of a Lasso parameter can increase the error significantly, and a 50% underestimate can cause the error to increase by a factor of $10^{9}$.
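In the proximal denoising setup with $\ell_{1}$ regularization, the unconstrained Lasso program has a closed-form solution, the soft-thresholding operator, which makes the parameter sensitivity discussed in the abstract easy to see. A minimal numpy sketch (the parameter name `lam` and the toy signal are illustrative choices, not from the paper):

```python
import numpy as np

def soft_threshold(y, lam):
    """Proximal operator of lam * ||.||_1, i.e. the closed-form solution of
    the unconstrained Lasso with identity measurement matrix (proximal
    denoising): argmin_x 0.5 * ||y - x||_2^2 + lam * ||x||_1."""
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

# Toy demonstration: a sparse signal observed under Gaussian noise.
# Varying lam changes which entries survive thresholding, which is the
# kind of parameter sensitivity the abstract studies.
rng = np.random.default_rng(0)
x_true = np.zeros(100)
x_true[:5] = 5.0                                  # sparse ground truth
y = x_true + 0.1 * rng.standard_normal(100)       # noisy observation
x_hat = soft_threshold(y, lam=0.3)
```

The constrained formulations of the Lasso do not admit this one-line solution; the sketch covers only the unconstrained program.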


2017 ◽  
Vol 2017 ◽  
pp. 1-9
Author(s):  
Chen ChunRong ◽  
Chen ShanXiong ◽  
Chen Lin ◽  
Zhu YuChen

In data mining, the analysis of high-dimensional data is a critical but thorny research topic. The LASSO (least absolute shrinkage and selection operator) algorithm avoids the limitations of traditional methods, which generally employ stepwise regression with information criteria to choose the optimal model. The improved LARS (least angle regression) algorithm solves the LASSO effectively. This paper presents an improved LARS algorithm, constructed on the basis of multidimensional weights, intended to address the problems in LASSO. Specifically, in order to distinguish the impact of each variable in the regression, we separately introduce partial principal component analysis (Part_PCA), independent-weight evaluation, and the CRITIC method into our proposal. These methods change the regression track by weighting each individual variable, optimizing both the approach direction and the selection of approach variables. As a consequence, our proposed algorithm can yield better results in the promising direction. Furthermore, we illustrate the favourable properties of the multidimensional-weight LARS algorithm on the Pima Indians Diabetes dataset. The experimental results show an attractive performance improvement of the proposed method over the improved LARS when both are subjected to the same threshold value.
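The core idea above, scoring each variable and letting that score modulate its penalty, can be sketched with a plain weighted Lasso solved by coordinate descent. This is a generic illustration of per-variable weighting, not the authors' improved LARS; the function name and the choice of coordinate descent (rather than a LARS path) are ours:

```python
import numpy as np

def weighted_lasso_cd(X, y, lam, w, n_iter=200):
    """Coordinate descent for the weighted Lasso
        argmin_b 0.5 * ||y - X b||_2^2 + lam * sum_j w[j] * |b[j]|.
    Per-feature weights w[j] (e.g. derived from PCA loadings or CRITIC
    scores, as the abstract suggests) shrink less-informative variables
    more aggressively."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)        # per-column squared norms
    r = y - X @ b                        # current residual
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * b[j]          # remove feature j's contribution
            rho = X[:, j] @ r            # partial correlation with residual
            # Soft-threshold with a feature-specific threshold lam * w[j].
            b[j] = np.sign(rho) * max(abs(rho) - lam * w[j], 0.0) / col_sq[j]
            r -= X[:, j] * b[j]          # restore residual
    return b
```

With all weights equal to one this reduces to the ordinary Lasso, so the weighting shows up purely as a per-coordinate change to the threshold.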


2004 ◽  
Vol 15 (1) ◽  
pp. 16-28 ◽  
Author(s):  
V. Roth

2018 ◽  
Vol 40 (12) ◽  
pp. 2992-3006 ◽  
Author(s):  
Shaogang Ren ◽  
Shuai Huang ◽  
Jieping Ye ◽  
Xiaoning Qian
