A new Gibbs sampler for Bayesian lasso

Cited by: 1
Authors:
Alhamzawi, Rahim [1 ]
Taha Mohammad Ali, Haithem [2 ]
Affiliations:
[1] Univ Al Qadisiyah, Dept Stat, Al Qadisiyah, Iraq
[2] Nawroz Univ, Coll Comp & Informat Technol, Duhok, Iraq
Keywords:
Bayesian inference; Gibbs sampler; hierarchical model; inverse Gaussian; lasso; linear regression; VARIABLE SELECTION; ADAPTIVE LASSO; R-PACKAGE; REGRESSION; CONVERGENCE;
DOI:
10.1080/03610918.2018.1508699
Chinese Library Classification (CLC): O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline codes: 020208; 070103; 0714
Abstract:
Lasso regression, the special case of Bridge regression with penalty function Σ_j |β_j|^q and q = 1, is considered from a Bayesian perspective. Park and Casella (2008) introduced Bayesian lasso regression, using a conditional Laplace prior distribution represented as a scale mixture of normals with an exponential mixing distribution. Recently, Mallick and Yi (2014) provided a new version of the Bayesian lasso regression approach, using a scale mixture of uniforms representation of the Laplace distribution with a particular gamma mixing density. In this paper, we propose a new Bayesian lasso regression method that uses a scale mixture of truncated normals representation of the Laplace density with exponential mixing densities. The method is illustrated via simulation examples and two real data sets. Results show that the proposed method performs very well. An extension to general models is also discussed.
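For orientation, the sketch below implements the classical scale-mixture-of-normals Gibbs sampler of Park and Casella (2008) that the abstract takes as its starting point; it is not the truncated-normal-mixture sampler proposed in the paper. The function name bayesian_lasso_gibbs, the fixed penalty parameter lam, and the use of NumPy are illustrative assumptions.

```python
# A minimal sketch of the Park and Casella (2008) Bayesian lasso Gibbs
# sampler (scale mixture of normals with exponential mixing), shown only
# as a reference point for the samplers discussed in the abstract.
# NOTE: this is NOT the truncated-normal-mixture sampler proposed in the
# paper; lam is held fixed here purely for illustration.
import numpy as np


def bayesian_lasso_gibbs(X, y, lam=1.0, n_iter=5000, seed=0):
    """Draw posterior samples of beta under the Bayesian lasso.

    Hierarchy: y | beta, sigma2 ~ N(X beta, sigma2 I),
               beta_j | sigma2, tau2_j ~ N(0, sigma2 * tau2_j),
               tau2_j ~ Exponential(rate = lam^2 / 2).
    X is assumed column-standardised; y is centred internally.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    y = y - y.mean()                       # centre response, drop intercept
    XtX, Xty = X.T @ X, X.T @ y

    beta, sigma2, tau2 = np.zeros(p), 1.0, np.ones(p)
    draws = np.empty((n_iter, p))

    for it in range(n_iter):
        # beta | rest ~ N(A^{-1} X'y, sigma2 * A^{-1}), A = X'X + D_tau^{-1}
        A_inv = np.linalg.inv(XtX + np.diag(1.0 / tau2))
        beta = rng.multivariate_normal(A_inv @ Xty, sigma2 * A_inv)

        # sigma2 | rest ~ Inverse-Gamma(shape, rate)
        resid = y - X @ beta
        shape = (n - 1) / 2.0 + p / 2.0
        rate = 0.5 * (resid @ resid + beta @ (beta / tau2))
        sigma2 = 1.0 / rng.gamma(shape, 1.0 / rate)

        # 1/tau2_j | rest ~ Inverse-Gaussian(sqrt(lam^2*sigma2/beta_j^2), lam^2)
        b2 = np.maximum(beta ** 2, 1e-12)  # guard against division by zero
        inv_tau2 = rng.wald(np.sqrt(lam ** 2 * sigma2 / b2), lam ** 2)
        tau2 = 1.0 / inv_tau2

        draws[it] = beta
    return draws
```

Posterior medians of the stored beta draws are commonly used as the lasso-type point estimate; in practice lam is either tuned by marginal maximum likelihood or given a gamma hyperprior, as discussed by Park and Casella (2008).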
Pages: 1855-1871
Page count: 17
Related papers (50 in total):
  • [41] Vandermerwe, A. J.; Botha, T. J. Bayesian estimation in mixed linear models using the Gibbs sampler. South African Statistical Journal, 1993, 27(2): 149-180.
  • [42] Choi, Hee Min; Hobert, James P. The Polya-Gamma Gibbs sampler for Bayesian logistic regression is uniformly ergodic. Electronic Journal of Statistics, 2013, 7: 2054-2064.
  • [43] White, Arthur; Wyse, Jason; Murphy, Thomas Brendan. Bayesian variable selection for latent class analysis using a collapsed Gibbs sampler. Statistics and Computing, 2016, 26: 511-527.
  • [44] Graepel, T.; Herbrich, R. The kernel Gibbs sampler. Advances in Neural Information Processing Systems 13, 2001: 514-520.
  • [45] Park, Taeyoung; Lee, Seunghan. Improving the Gibbs sampler. Wiley Interdisciplinary Reviews: Computational Statistics, 2022, 14(2).
  • [46] Casella, G.; George, E. I. Explaining the Gibbs sampler. American Statistician, 1992, 46(3): 167-174.
  • [47] Roman, Jorge Carlos; Hobert, James P.; Presnell, Brett. On reparametrization and the Gibbs sampler. Statistics & Probability Letters, 2014, 91: 110-116.
  • [48] Roman, Jorge Carlos; Hobert, James P. Convergence analysis of the Gibbs sampler for Bayesian general linear mixed models with improper priors. Annals of Statistics, 2012, 40(6): 2823-2849.
  • [49] Yu, Jin-Zhu; Baroud, Hiba. Approximate Gibbs sampler for efficient inference of hierarchical Bayesian models for grouped count data. Journal of Statistical Computation and Simulation, 2024, 94(14): 3043-3062.
  • [50] Martinez, Alfonso J.; Templin, Jonathan. Estimating Bayesian diagnostic models with attribute hierarchies with the Hamiltonian-Gibbs hybrid sampler. Multivariate Behavioral Research, 2023, 58(1): 141-142.