A fully Bayesian approach to sparse reduced-rank multivariate regression

Cited by: 1
Authors
Yang, Dunfu [1 ]
Goh, Gyuhyeong [1 ]
Wang, Haiyan [1 ]
Affiliations
[1] Kansas State Univ, Dept Stat, 101 Dickens Hall, 1116 Midcampus Dr N, Manhattan, KS 66506 USA
Keywords
Bayesian reduced-rank regression; fully Bayesian inference; high-dimensional variable selection; low-rank matrix estimation; multivariate linear regression; simultaneous dimension reduction; variable selection; model choice; approximations
DOI
10.1177/1471082X20948697
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
In the context of high-dimensional multivariate linear regression, sparse reduced-rank regression (SRRR) provides a way to handle both variable selection and low-rank estimation problems. Although there has been extensive research on SRRR, statistical inference procedures that account for the uncertainty due to variable selection and rank reduction are still limited. To fill this research gap, we develop a fully Bayesian approach to SRRR. A major difficulty in a fully Bayesian framework is that the dimension of the parameter space varies with the selected variables and the reduced rank. Because of this varying-dimensional problem, traditional Markov chain Monte Carlo (MCMC) methods such as the Gibbs sampler and the Metropolis-Hastings algorithm are inapplicable in our Bayesian framework. To address this issue, we propose a new posterior computation procedure based on a Laplace approximation within a collapsed Gibbs sampler. A key feature of our fully Bayesian method is that model uncertainty is automatically integrated out by the proposed MCMC computation. The proposed method is examined via a simulation study and a real data analysis.
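The abstract describes the computational idea at a high level: with the regression coefficients integrated out, a collapsed Gibbs sampler moves over the set of selected predictors and the rank, and a Laplace approximation replaces the otherwise intractable marginal likelihood of each (variable subset, rank) configuration. The Python sketch below is a minimal illustration of that idea under simplifying assumptions, not the authors' algorithm: it takes the error variance as known, uses a BIC-style Laplace proxy for the marginal likelihood, treats a truncated-SVD reduced-rank fit as the posterior mode, and the function names (rank_r_fit, log_marginal, collapsed_gibbs) are hypothetical.

import numpy as np

def rank_r_fit(Y, Xg, r):
    # Reduced-rank fit used as a stand-in for the posterior mode:
    # least-squares coefficients truncated to rank r via the SVD.
    B_ls, *_ = np.linalg.lstsq(Xg, Y, rcond=None)
    U, s, Vt = np.linalg.svd(B_ls, full_matrices=False)
    s[r:] = 0.0
    return U @ np.diag(s) @ Vt

def log_marginal(Y, X, gamma, r, sigma2=1.0):
    # BIC-style Laplace proxy for log p(Y | gamma, r): Gaussian log-likelihood
    # at the mode minus (d/2) log n, where d = r (p_gamma + q - r) counts the
    # free parameters of a rank-r coefficient matrix on the selected predictors.
    n, q = Y.shape
    idx = np.flatnonzero(gamma)
    if idx.size == 0 or r == 0:
        resid, d = Y, 0
    else:
        Xg = X[:, idx]
        r_eff = min(r, idx.size, q)
        resid = Y - Xg @ rank_r_fit(Y, Xg, r_eff)
        d = r_eff * (idx.size + q - r_eff)
    return -0.5 * np.sum(resid ** 2) / sigma2 - 0.5 * d * np.log(n)

def collapsed_gibbs(Y, X, r_max=3, n_iter=200, seed=0):
    # Collapsed Gibbs scan over the inclusion indicators gamma and the rank r;
    # the coefficients themselves never appear, only the approximate marginals.
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    gamma = np.zeros(p, dtype=bool)
    r, draws = 1, []
    for _ in range(n_iter):
        for j in range(p):
            logp = []
            for val in (False, True):          # marginal with predictor j out / in
                gamma[j] = val
                logp.append(log_marginal(Y, X, gamma, r))
            m = max(logp)
            prob_in = np.exp(logp[1] - m) / (np.exp(logp[0] - m) + np.exp(logp[1] - m))
            gamma[j] = rng.random() < prob_in
        logp_r = np.array([log_marginal(Y, X, gamma, k) for k in range(1, r_max + 1)])
        w = np.exp(logp_r - logp_r.max())
        r = int(rng.choice(np.arange(1, r_max + 1), p=w / w.sum()))
        draws.append((gamma.copy(), r))
    return draws

Under these assumptions, each sweep re-prices every inclusion flip and every candidate rank by its approximate marginal likelihood, so the draws of (gamma, r) carry the model uncertainty; posterior inclusion probabilities, for instance, can be estimated by averaging the gamma draws after a burn-in period.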
Pages: 199 - 220
Number of pages: 22
Related Papers
50 records in total
  • [31] Discovering genetic associations with high-dimensional neuroimaging phenotypes: A sparse reduced-rank regression approach
    Vounou, Maria
    Nichols, Thomas E.
    Montana, Giovanni
    NEUROIMAGE, 2010, 53 (03) : 1147 - 1159
  • [32] Correction: Sparse reduced-rank regression for simultaneous rank and variable selection via manifold optimization
    Yoshikawa, Kohei
    Kawano, Shuichi
    COMPUTATIONAL STATISTICS, 2023, 38 : 77 - 78
  • [33] Efficient estimation of reduced-rank partial envelope model in multivariate linear regression
    Zhang, Jing
    Huang, Zhensheng
    Xiong, Yan
    RANDOM MATRICES-THEORY AND APPLICATIONS, 2021, 10 (02)
  • [34] The APT model as reduced-rank regression
    Bekker, P
    Dobbelstein, P
    Wansbeek, T
    JOURNAL OF BUSINESS & ECONOMIC STATISTICS, 1996, 14 (02) : 199 - 202
  • [35] Online Robust Reduced-Rank Regression
    Yang, Yangzhuoran Fin
    Zhao, Ziping
    2020 IEEE 11TH SENSOR ARRAY AND MULTICHANNEL SIGNAL PROCESSING WORKSHOP (SAM), 2020,
  • [36] Srrr-cluster: Using Sparse Reduced-Rank Regression to Optimize iCluster
    Ge, Shu-Guang
    Xia, Jun-Feng
    Wei, Pi-Jing
    Zheng, Chun-Hou
    INTELLIGENT COMPUTING METHODOLOGIES, ICIC 2016, PT III, 2016, 9773 : 99 - 106
  • [37] Wavelet-Based Sparse Reduced-Rank Regression for Hyperspectral Image Restoration
    Rasti, Behnood
    Sveinsson, Johannes R.
    Ulfarsson, Magnus Orn
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2014, 52 (10) : 6688 - 6698
  • [39] Robust reduced-rank modeling via rank regression
    Zhao, Weihua
    Lian, Heng
    Ma, Shujie
    JOURNAL OF STATISTICAL PLANNING AND INFERENCE, 2017, 180 : 1 - 12
  • [40] Poisson reduced-rank models with sparse loadings
    Lee, Eun Ryung
    Park, Seyoung
    JOURNAL OF THE KOREAN STATISTICAL SOCIETY, 2021, 50 (04) : 1079 - 1097