Keywords

reciprocal lasso
SMTN
scale mixture of double Pareto
hierarchical prior model
Gibbs sampler

Abstract

This paper discusses the Bayesian reciprocal lasso (rlasso) regularization method as a variable selection procedure that produces a more interpretable model with a minimal set of predictor variables when the response variable is right-censored and limited. The reciprocal lasso introduces the reciprocal of the L1-norm into the penalty function of the penalized-estimation minimization problem, and was recently developed as a regularization method that produces a parsimonious regression model. We utilize the scale mixture of double Pareto (SMDP) and the scale mixture of truncated normal (SMTN) representations discussed by Mallick et al. (2020), with a modification of the SMTN, in the hierarchical prior model. We apply the SMDP and the modified SMTN to the right-censored regression model in a real data analysis. The results show that the two employed scale mixture representations outperform other common regularization methods.
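For concreteness, the reciprocal lasso objective referred to above can be sketched in standard linear-model notation (a hedged illustration; the symbols y for the response, X for the design matrix, β for the coefficient vector, and λ for the tuning parameter are assumed here and are not taken from the abstract itself):

```latex
\hat{\beta} \;=\; \arg\min_{\beta}\;
\left\{ \lVert y - X\beta \rVert_2^2
\;+\; \lambda \sum_{j=1}^{p} \frac{1}{\lvert \beta_j \rvert}\,
\mathbb{1}\{\beta_j \neq 0\} \right\}
```

Unlike the ordinary lasso penalty λ Σ|β_j|, the reciprocal penalty decreases in |β_j|, so it penalizes small nonzero coefficients heavily and pushes them to exactly zero, which is what yields the parsimonious models described in the abstract.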