Scalable Bayesian High-dimensional Local Dependence Learning

Cited by: 1
Authors
Lee, Kyoungjae [1 ]
Lin, Lizhen [2 ]
Affiliations
[1] Sungkyunkwan Univ, Dept Stat, Seoul, South Korea
[2] Univ Notre Dame, Dept Appl & Computat Math & Stat, Notre Dame, IN USA
Source
BAYESIAN ANALYSIS | 2023, Vol. 18, No. 1
Funding
National Research Foundation of Singapore;
Keywords
selection consistency; optimal posterior convergence rate; varying bandwidth; POSTERIOR CONVERGENCE-RATES; LARGE PRECISION MATRICES; CONSISTENCY; SELECTION; MODELS; LIKELIHOOD; SPARSITY;
DOI
10.1214/21-BA1299
Chinese Library Classification
O1 [Mathematics];
Subject Classification Code
0701; 070101;
Abstract
In this work, we propose a scalable Bayesian procedure for learning the local dependence structure in a high-dimensional model where the variables possess a natural ordering. The ordering of variables can be indexed by time, the vicinity of spatial locations, and so on, with the natural assumption that variables far apart tend to have weak correlations. Applications of such models abound in a variety of fields such as finance, genome association analysis and spatial modeling. We adopt a flexible framework under which each variable is dependent on its neighbors or predecessors, and the neighborhood size can vary for each variable. It is of great interest to reveal this local dependence structure by estimating the covariance or precision matrix while yielding a consistent estimate of the varying neighborhood size for each variable. The existing literature on banded covariance matrix estimation, which assumes a fixed bandwidth, cannot be adapted to this general setup. We employ the modified Cholesky decomposition for the precision matrix and design a flexible prior for this model through appropriate priors on the neighborhood sizes and Cholesky factors. The posterior contraction rates of the Cholesky factor are derived and shown to be nearly or exactly minimax optimal, and our procedure leads to consistent estimates of the neighborhood size for all the variables. Another appealing feature of our procedure is its scalability to models with large numbers of variables, owing to efficient posterior inference that does not resort to MCMC algorithms. Numerical comparisons are carried out with competitive methods, and applications are considered for some real datasets.
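A minimal sketch of the modified Cholesky decomposition (MCD) with varying bandwidth that the abstract refers to, written in standard notation; the symbols k_j, a_{jl}, d_j, A, and D below are illustrative and are not taken from this record. Each ordered variable is regressed on its k_j nearest predecessors,
\[
  X_j = \sum_{l = j - k_j}^{j-1} a_{jl}\, X_l + \epsilon_j,
  \qquad \epsilon_j \sim N(0,\, d_j), \qquad j = 1, \dots, p,
\]
which induces the precision matrix
\[
  \Omega = (I_p - A)^\top D^{-1} (I_p - A),
\]
where A = (a_{jl}) is strictly lower triangular with its j-th row supported on the k_j nearest predecessors of X_j (the varying bandwidth), and D = diag(d_1, ..., d_p). Under this factorization, learning the local dependence structure amounts to selecting k_j for each row, and the rows can be handled as separate regressions, which is consistent with the abstract's claim of scalable posterior inference without MCMC.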
Pages: 25-47
Number of pages: 23
Related Papers
50 records in total
  • [21] Batched High-dimensional Bayesian Optimization via Structural Kernel Learning
    Wang, Zi
    Li, Chengtao
    Jegelka, Stefanie
    Kohli, Pushmeet
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 70, 2017, 70
  • [22] High-Dimensional Bayesian Optimization with Multi-Task Learning for RocksDB
    Alabed, Sami
    Yoneki, Eiko
    PROCEEDINGS OF THE 1ST WORKSHOP ON MACHINE LEARNING AND SYSTEMS (EUROMLSYS'21), 2021, : 111 - 119
  • [23] Bayesian and High-Dimensional Global Optimization
    Sergeyev, Yaroslav D.
    OPTIMIZATION LETTERS, 2021, 15 (08) : 2897 - 2899
  • [24] Bayesian Optimization with High-Dimensional Outputs
    Maddox, Wesley J.
    Balandat, Maximilian
    Wilson, Andrew Gordon
    Bakshy, Eytan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [25] Fast and scalable learning of sparse changes in high-dimensional graphical model structure
    Wang, Beilun
    Zhang, Jiaqi
    Xu, Haoqing
    Tao, Te
    NEUROCOMPUTING, 2022, 514 : 39 - 57
  • [26] A SCALABLE DEEP LEARNING APPROACH FOR SOLVING HIGH-DIMENSIONAL DYNAMIC OPTIMAL TRANSPORT
    Wan, Wei
    Zhang, Yuejin
    Bao, Chenglong
    Dong, Bin
    Shi, Zuoqiang
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2023, 45 (04): : B544 - B563
  • [27] SCALABLE BAYESIAN REDUCED-ORDER MODELS FOR SIMULATING HIGH-DIMENSIONAL MULTISCALE DYNAMICAL SYSTEMS
    Koutsourelakis, Phaedon-Stelios
    Bilionis, Elias
    MULTISCALE MODELING & SIMULATION, 2011, 9 (01): : 449 - 485
  • [28] Scalable high-dimensional Bayesian varying coefficient models with unknown within-subject covariance
    Bai, Ray
    Boland, Mary R.
    Chen, Yong
    JOURNAL OF MACHINE LEARNING RESEARCH, 2023, 24
  • [29] Scalable inference for high-dimensional precision matrix
    Zheng, Zemin
    Wang, Yue
    Yu, Yugang
    Li, Yang
    COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 2022, 51 (23) : 8205 - 8224
  • [30] Decompositions of dependence for high-dimensional extremes
    Cooley, D.
    Thibaud, E.
    BIOMETRIKA, 2019, 106 (03) : 587 - 604