Bayesian variable selection in binary quantile regression

2016 ◽  
Vol 118 ◽  
pp. 177-181 ◽  
Author(s):  
Man-Suk Oh ◽  
Eun Sug Park ◽  
Beong-Soo So

2018 ◽  
Vol 38 (6) ◽  
pp. 679-694 ◽  
Author(s):  
Katerina Aristodemou ◽  
Jian He ◽  
Keming Yu

2013 ◽  
Vol 6 (2) ◽  
pp. 261-274 ◽  
Author(s):  
Cathy W. S. Chen ◽  
David B. Dunson ◽  
Craig Reed ◽  
Keming Yu

2019 ◽  
Author(s):  
Sierra Bainter ◽  
Thomas Granville McCauley ◽  
Tor D Wager ◽  
Elizabeth Reynolds Losin

In this paper we address the problem of selecting important predictors from a larger set of candidate predictors. Standard techniques are limited by low power and high false-positive rates. Stochastic search variable selection (SSVS), a Bayesian variable selection approach used widely in biostatistics, combats these issues by accounting for uncertainty in the other predictors of the model. We present Bayesian variable selection to aid researchers facing this common scenario, along with an online application (https://ssvsforpsych.shinyapps.io/ssvsforpsych/) to perform the analysis and visualize the results. Using an application predicting pain ratings, we demonstrate how this approach quickly identifies reliable predictors, even when the set of candidate predictors is larger than the sample size. The technique is widely applicable to research questions that may be relatively data-rich but have limited information or theory to guide variable selection.
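To make the stochastic-search idea concrete, the sketch below runs a Gibbs sampler over predictor inclusion indicators and reports posterior inclusion probabilities. It is a minimal illustration, not the implementation behind the app described above: it approximates each model's marginal likelihood with BIC rather than using spike-and-slab priors, and the function names (`ssvs`, `bic`) and the flat model prior are our assumptions.

```python
import numpy as np
from scipy.special import expit  # numerically stable logistic function

def bic(X, y, mask):
    """BIC of the linear model using only the columns where mask is True."""
    n = len(y)
    k = int(mask.sum())
    if k == 0:
        rss = y @ y
    else:
        beta, *_ = np.linalg.lstsq(X[:, mask], y, rcond=None)
        resid = y - X[:, mask] @ beta
        rss = resid @ resid
    return n * np.log(rss / n) + k * np.log(n)

def ssvs(X, y, n_iter=2000, burn=500, seed=0):
    """Stochastic search over inclusion indicators via Gibbs sampling.

    Each indicator is resampled from its conditional posterior, with
    exp(-BIC/2) standing in for the model's marginal likelihood and a
    flat prior over models (both simplifying assumptions). Returns the
    posterior inclusion probability of each predictor.
    """
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    gamma = np.zeros(p, dtype=bool)  # start from the empty model
    counts = np.zeros(p)
    for it in range(n_iter):
        for j in rng.permutation(p):
            g1 = gamma.copy(); g1[j] = True
            g0 = gamma.copy(); g0[j] = False
            l1 = -0.5 * bic(X, y, g1)
            l0 = -0.5 * bic(X, y, g0)
            gamma[j] = rng.random() < expit(l1 - l0)
        if it >= burn:
            counts += gamma
    return counts / (n_iter - burn)
```

Predictors with inclusion probabilities near 1 across sweeps are the "reliable" predictors in the abstract's sense; noise predictors hover near their prior-driven baseline.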


2020 ◽  
Vol 21 (1) ◽  
Author(s):  
Matthew D. Koslovsky ◽  
Marina Vannucci

An amendment to this paper has been published and can be accessed via the original article.


Entropy ◽  
2020 ◽  
Vol 23 (1) ◽  
pp. 33
Author(s):  
Edmore Ranganai ◽  
Innocent Mudhombo

The importance of variable selection and regularization procedures in multiple regression analysis cannot be overemphasized. These procedures are adversely affected by data aberrations in the predictor space as well as by outliers in the response space. To counter the latter, robust statistical procedures such as quantile regression, which generalizes the well-known least absolute deviation procedure to all quantile levels, have been proposed in the literature. Quantile regression is robust to response-variable outliers but very susceptible to outliers in the predictor space (high leverage points), which may alter the eigen-structure of the predictor matrix. High leverage points that alter the eigen-structure of the predictor matrix by creating or hiding collinearity are referred to as collinearity-influential points. In this paper, we suggest generalizing the penalized weighted least absolute deviation procedure to all quantile levels, i.e., to penalized weighted quantile regression with ridge, LASSO, and elastic net penalties, as a remedy against collinearity-influential points and high leverage points in general. To maintain robustness, we use highly robust weights based on the computationally intensive, high-breakdown minimum covariance determinant (MCD) estimator. Simulations and applications to well-known data sets from the literature show an improvement in variable selection and regularization due to the robust weighting formulation.
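The weighting-plus-penalization recipe above can be sketched in two steps: compute MCD-based robust weights that downweight high leverage points, then solve the LASSO-penalized weighted quantile regression as a linear program. This is a minimal illustration, not the authors' code: the 0.975 chi-square cutoff, the weight function min(1, cutoff/d²), and the no-intercept formulation (predictors and response assumed centered) are our assumptions.

```python
import numpy as np
from scipy.optimize import linprog
from scipy.stats import chi2
from sklearn.covariance import MinCovDet

def mcd_weights(X, random_state=0):
    """Robust weights from the high-breakdown MCD estimator.

    Squared robust Mahalanobis distances beyond the 0.975 chi-square
    cutoff are downweighted (the cutoff and weight function are our
    illustrative choices, not necessarily the paper's).
    """
    mcd = MinCovDet(random_state=random_state).fit(X)
    d2 = mcd.mahalanobis(X)  # squared robust distances
    cutoff = chi2.ppf(0.975, df=X.shape[1])
    return np.minimum(1.0, cutoff / d2)

def penalized_weighted_qr(X, y, tau=0.5, lam=0.1, weights=None):
    """LASSO-penalized weighted quantile regression via linear programming.

    Minimizes sum_i w_i * rho_tau(y_i - x_i'b) + lam * ||b||_1, where
    rho_tau is the check loss. Splitting b = b+ - b- and the residual
    into u - v (all nonnegative) makes the objective linear.
    """
    n, p = X.shape
    w = np.ones(n) if weights is None else np.asarray(weights)
    # decision variables: [b+, b-, u, v], all >= 0
    c = np.concatenate([lam * np.ones(p), lam * np.ones(p),
                        tau * w, (1.0 - tau) * w])
    # equality constraints: X b+ - X b- + u - v = y
    A_eq = np.hstack([X, -X, np.eye(n), -np.eye(n)])
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    return res.x[:p] - res.x[p:2 * p]
```

Passing `weights=mcd_weights(X)` to `penalized_weighted_qr` gives the robust-weighted fit; setting `tau` away from 0.5 reproduces the same construction at other quantile levels.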

