coefficient shrinkage
Recently Published Documents


TOTAL DOCUMENTS

10
(FIVE YEARS 2)

H-INDEX

3
(FIVE YEARS 0)

2021 ◽  
Vol 9 (1) ◽  
pp. 1045-1060
Author(s):  
Laavanya Mohan ◽  
Vijayaragahvan Veeramani

Image denoising is a major challenge in image processing. The main objective is to suppress noise in the degraded image while keeping the other details of the image unchanged. In recent years, many multi-resolution approaches have achieved great success in image denoising. In a nutshell, the wavelet transform provides a near-optimal representation of a noisy image: the signal is captured by a small number of large coefficients, while the noise is spread across all the remaining coefficients. The most popular way to eliminate noise is therefore to threshold the noise-affected wavelet coefficients. This coefficient shrinkage works well only if the threshold value is properly selected, so the performance of the various wavelet-based denoising techniques depends on the estimation of the threshold value. Different techniques are available to find the threshold value. The aim of this study is to discuss denoising schemes based on various wavelet transforms using the thresholding approach. Hence, this article surveys threshold selection based on spatial adaptivity, sub-band adaptivity, and hybrid methods with multi-resolution wavelet structures.
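As a concrete illustration of the shrinkage idea this abstract describes (a minimal sketch, not code from the article; the function names and the choice of a one-level Haar transform with the Donoho–Johnstone universal threshold are my own assumptions), wavelet coefficient shrinkage via soft thresholding might look like this in NumPy:

```python
import numpy as np

def haar_dwt(x):
    """One-level Haar transform: approximation and detail coefficients."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def haar_idwt(a, d):
    """Inverse of the one-level Haar transform."""
    x = np.empty(a.size * 2)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def soft_threshold(c, t):
    """Shrink coefficients toward zero by t; values below t become exactly 0."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

def denoise(x):
    """Threshold the detail coefficients, keep the approximation intact."""
    a, d = haar_dwt(x)
    # Universal threshold sigma * sqrt(2 log n), with the noise level sigma
    # estimated from the detail coefficients via the median absolute deviation.
    sigma = np.median(np.abs(d)) / 0.6745
    t = sigma * np.sqrt(2.0 * np.log(x.size))
    return haar_idwt(a, soft_threshold(d, t))
```

Because the signal concentrates in a few large coefficients while noise spreads over many small ones, shrinking the small detail coefficients removes mostly noise.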


2019 ◽  
Vol 1 (1) ◽  
pp. 359-383 ◽  
Author(s):  
Frank Emmert-Streib ◽  
Matthias Dehmer

Regression models are a form of supervised learning that is important for machine learning, statistics, and general data science. Although classical ordinary least squares (OLS) regression has been known for a long time, in recent years there have been many new developments that extend this model significantly. Above all, the least absolute shrinkage and selection operator (LASSO) model has gained considerable interest. In this paper, we review general regression models with a focus on the LASSO and extensions thereof, including the adaptive LASSO, elastic net, and group LASSO. We discuss the regularization terms responsible for inducing coefficient shrinkage and variable selection, which lead to the improved performance of these regression models. This makes these modern, computational regression models valuable tools for analyzing high-dimensional problems.
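The coefficient shrinkage and variable selection the LASSO induces can be made concrete with a small coordinate-descent sketch (an illustration under the standard 1/(2n) least-squares formulation, not code from the paper; `lasso_cd` and the other names are hypothetical):

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of the L1 penalty: shrink z toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Cyclic coordinate descent for (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n  # per-column scaling X_j'X_j / n
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual: remove every predictor's contribution except j's.
            r = y - X @ b + X[:, j] * b[j]
            rho = X[:, j] @ r / n
            # Soft thresholding shrinks small coefficients exactly to zero,
            # which is what gives the LASSO its variable-selection property.
            b[j] = soft_threshold(rho, lam) / col_sq[j]
    return b
```

For a sparse ground truth, coefficients of irrelevant predictors land exactly at zero once their partial correlation falls below `lam`, while the active coefficients are shrunk slightly toward zero.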


IEEE Access ◽  
2019 ◽  
Vol 7 ◽  
pp. 13781-13790 ◽  
Author(s):  
Shanshan Gao ◽  
Ningning Guo ◽  
Mingli Zhang ◽  
Jing Chi ◽  
Caiming Zhang

Author(s):  
Vesteinn Thorsson ◽  
Michael Hörnquist ◽  
Andrew F Siegel ◽  
Leroy Hood

We examine the application of statistical model selection methods to reverse-engineering the control of galactose utilization in yeast from DNA microarray data. In these experiments, relationships among gene expression values are revealed through changes in galactose level and through gene-knockout perturbations. For each gene variable, we select predictors using a variety of methods, taking into account the variance in each measurement. These methods include maximization of log-likelihood with Cp, AIC, and BIC penalties, bootstrap and cross-validation error estimation, and coefficient shrinkage via the Lasso.
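As a hedged sketch of the penalized log-likelihood selection this abstract mentions (not the authors' code; the Gaussian-error assumption, exhaustive search, and all function names are mine), scoring candidate predictor subsets by AIC or BIC could look like:

```python
import numpy as np
from itertools import combinations

def ols_rss(X, y):
    """Residual sum of squares of a least-squares fit with an intercept."""
    Xi = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xi, y, rcond=None)
    r = y - Xi @ beta
    return r @ r

def best_subset(X, y, criterion="bic"):
    """Exhaustive subset search scored by AIC or BIC under Gaussian errors.

    Parameter count k = predictors + intercept + error variance.
    """
    n, p = X.shape
    best_score, best_subset_found = np.inf, ()
    for size in range(p + 1):
        for S in combinations(range(p), size):
            rss = ols_rss(X[:, list(S)], y)
            k = size + 2
            # Up to a constant, -2 * max log-likelihood is n * log(rss / n).
            penalty = np.log(n) if criterion == "bic" else 2.0
            score = n * np.log(rss / n) + penalty * k
            if score < best_score:
                best_score, best_subset_found = score, S
    return best_subset_found
```

BIC's heavier log(n) penalty tends to pick the sparsest adequate predictor set, while AIC occasionally keeps an extra spurious predictor; exhaustive search is only feasible for a handful of candidates, which is why shrinkage methods like the Lasso scale better.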

