Convergence analysis of a collapsed Gibbs sampler for Bayesian vector autoregressions

Cited by: 5
Authors
Ekvall, Karl Oskar [1 ]
Jones, Galin L. [2 ]
Affiliations
[1] Karolinska Inst, Inst Environm Med, Div Biostat, Nobels Vag 13, S-17177 Stockholm, Sweden
[2] Univ Minnesota, Sch Stat, 313 Ford Hall,224 Church St SE, Minneapolis, MN 55455 USA
Source
ELECTRONIC JOURNAL OF STATISTICS | 2021, Vol. 15, No. 1
Funding
Austrian Science Fund
Keywords
Convergence complexity analysis; geometric ergodicity; Markov chain Monte Carlo; Bayesian vector autoregression; Gibbs sampler; spectral variance estimators
DOI
10.1214/21-EJS1800
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject classification codes
020208; 070103; 0714
Abstract
We study the convergence properties of a collapsed Gibbs sampler for Bayesian vector autoregressions with predictors, or exogenous variables. The Markov chain generated by our algorithm is shown to be geometrically ergodic regardless of whether the number of observations in the underlying vector autoregression is small or large in comparison to its order and dimension. In a convergence complexity analysis, we also give conditions under which the geometric ergodicity is asymptotically stable as the number of observations tends to infinity. Specifically, the geometric convergence rate is shown to be bounded away from unity asymptotically, either almost surely or with probability tending to one, depending on what is assumed about the data-generating process. This result is one of the first of its kind for practically relevant Markov chain Monte Carlo algorithms. Our convergence results hold under close to arbitrary model misspecification.
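To illustrate the kind of algorithm the abstract refers to, the sketch below implements a plain two-block Gibbs sampler for a vector autoregression with predictors written as a multivariate regression Y = Z Phi + E, with the rows of E i.i.d. N(0, Sigma). This is not the authors' collapsed sampler: the blocking, the independent normal prior on vec(Phi), the inverse-Wishart prior on Sigma, and all names (gibbs_bvar, m0, V0, nu0, S0) are assumptions made only for exposition, and Z is assumed to already stack an intercept, the lagged values of Y, and any exogenous predictors.

    import numpy as np
    from scipy.stats import invwishart

    def gibbs_bvar(Y, Z, m0, V0, nu0, S0, n_iter=2000, rng=None):
        """Two-block Gibbs sampler for Y = Z @ Phi + E, rows of E iid N(0, Sigma).
        Assumed priors: vec(Phi) ~ N(m0, V0), independent of Sigma ~ IW(nu0, S0)."""
        rng = np.random.default_rng() if rng is None else rng
        n, k = Y.shape
        q = Z.shape[1]
        V0_inv = np.linalg.inv(V0)
        ZtZ = Z.T @ Z
        Sigma = S0 / max(nu0 - k - 1, 1)   # crude start: roughly the prior mean of Sigma
        Phi_draws, Sigma_draws = [], []
        for _ in range(n_iter):
            # Draw vec(Phi) | Sigma, Y from its Gaussian full conditional.
            # Column-stacked vec ordering, so the likelihood precision is Sigma^{-1} kron Z'Z.
            Sigma_inv = np.linalg.inv(Sigma)
            prec = V0_inv + np.kron(Sigma_inv, ZtZ)
            cov = np.linalg.inv(prec)
            cov = 0.5 * (cov + cov.T)      # symmetrize against round-off
            mean = cov @ (V0_inv @ m0 + (Z.T @ Y @ Sigma_inv).flatten(order="F"))
            Phi = rng.multivariate_normal(mean, cov).reshape(q, k, order="F")
            # Draw Sigma | Phi, Y from its inverse-Wishart full conditional.
            resid = Y - Z @ Phi
            Sigma = invwishart.rvs(df=nu0 + n, scale=S0 + resid.T @ resid,
                                   random_state=rng)
            Phi_draws.append(Phi)
            Sigma_draws.append(Sigma)
        return np.array(Phi_draws), np.array(Sigma_draws)

A collapsed version would integrate one block out analytically before updating the other, which is what the paper analyzes; the exact blocking, priors, and collapsing used by the authors differ from this illustrative sketch.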
Pages: 691-721
Page count: 31
Related papers
50 records in total
  • [1] A Gibbs sampler for structural vector autoregressions
    Waggoner, DF
    Zha, T
    JOURNAL OF ECONOMIC DYNAMICS & CONTROL, 2003, 28 (02): : 349 - 366
  • [2] Bayesian variable selection for latent class analysis using a collapsed Gibbs sampler
    White, Arthur
    Wyse, Jason
    Murphy, Thomas Brendan
    STATISTICS AND COMPUTING, 2016, 26 (1-2) : 511 - 527
  • [4] A Partially Collapsed Gibbs Sampler with Accelerated Convergence for EEG Source Localization
    Costa, Facundo
    Batatia, Hadj
    Oberlin, Thomas
    Tourneret, Jean-Yves
    2016 IEEE STATISTICAL SIGNAL PROCESSING WORKSHOP (SSP), 2016,
  • [6] Bounding the convergence time of the Gibbs sampler in Bayesian image restoration
    Gibbs, AL
    BIOMETRIKA, 2000, 87 (04) : 749 - 766
  • [7] Convergence analysis of the Gibbs sampler for Bayesian general linear mixed models with improper priors
    Roman, Jorge Carlos
    Hobert, James P.
    ANNALS OF STATISTICS, 2012, 40 (06): : 2823 - 2849
  • [8] Bayesian Vector Autoregressions
    Wozniak, Tomasz
    AUSTRALIAN ECONOMIC REVIEW, 2016, 49 (03) : 365 - 380
  • [9] A fast collapsed Gibbs sampler for frequency domain operational modal analysis
    Dollon, Quentin
    Antoni, Jerome
    Tahan, Antoine
    Gagnon, Martin
    Monette, Christine
    MECHANICAL SYSTEMS AND SIGNAL PROCESSING, 2022, 173
  • [10] Convergence analysis of the block Gibbs sampler for Bayesian probit linear mixed models with improper priors
    Wang, Xin
    Roy, Vivekananda
    ELECTRONIC JOURNAL OF STATISTICS, 2018, 12 (02): : 4412 - 4439