Gaussian variational approximation with sparse precision matrices

Cited by: 0
Authors
Linda S. L. Tan
David J. Nott
Affiliations
[1] National University of Singapore, Department of Statistics and Applied Probability
[2] National University of Singapore, Operations Research and Analytics Cluster
Source
Statistics and Computing | 2018, Vol. 28
Keywords
Gaussian variational approximation; Stochastic gradient algorithms; Sparse precision matrix; Variational Bayes
DOI
Not available
Abstract
We consider the problem of learning a Gaussian variational approximation to the posterior distribution for a high-dimensional parameter, where we impose sparsity in the precision matrix to reflect appropriate conditional independence structure in the model. Incorporating sparsity in the precision matrix allows the Gaussian variational distribution to be both flexible and parsimonious, and the sparsity is achieved through parameterization in terms of the Cholesky factor. Efficient stochastic gradient methods that make appropriate use of gradient information for the target distribution are developed for the optimization. We also consider alternative estimators of the stochastic gradients that have lower variance and are more stable. Our approach is illustrated using generalized linear mixed models and state-space models for time series.
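To make the construction concrete, the sketch below (an illustrative assumption, not the authors' implementation) parameterizes the precision matrix of the Gaussian variational distribution as Omega = T T', with T a lower-triangular Cholesky factor restricted to a fixed sparsity pattern, and maximizes a single-sample reparameterized ELBO estimate by stochastic gradient ascent. The toy Bayesian logistic-regression target, the one-sub-diagonal band mask, the plain SGD update, and all function names are choices made only to keep the example self-contained; JAX supplies the automatic differentiation.

```python
# Minimal sketch (not the authors' code) of Gaussian variational approximation
# q(theta) = N(mu, Omega^{-1}) with a sparse Cholesky factor T of the precision
# matrix Omega = T T', fitted by stochastic gradient ascent on a one-sample
# reparameterized ELBO estimate.
import jax
import jax.numpy as jnp
from jax.scipy.linalg import solve_triangular

d = 5  # dimension of the parameter theta

def log_joint(theta, X, y):
    """Unnormalized log posterior: N(0, 10 I) prior + Bernoulli-logit likelihood."""
    logits = X @ theta
    loglik = jnp.sum(y * logits - jax.nn.softplus(logits))
    logprior = -0.5 * jnp.sum(theta ** 2) / 10.0
    return loglik + logprior

# Fixed sparsity pattern for the strictly lower triangle of T: one sub-diagonal band.
band_mask = jnp.tril(jnp.ones((d, d)), k=-1) * jnp.triu(jnp.ones((d, d)), k=-1)

def build_T(params):
    """Assemble the sparse Cholesky factor T of the precision matrix Omega = T T'."""
    return band_mask * params["W"] + jnp.diag(jnp.exp(params["log_diag"]))

def elbo_estimate(params, z, X, y):
    """One-sample ELBO via the reparameterization theta = mu + T^{-T} z, z ~ N(0, I)."""
    T = build_T(params)
    theta = params["mu"] + solve_triangular(T, z, lower=True, trans="T")
    # Entropy of N(mu, (T T')^{-1}):  d/2 * log(2*pi*e) - sum_i log T_ii
    entropy = 0.5 * d * jnp.log(2.0 * jnp.pi * jnp.e) - jnp.sum(params["log_diag"])
    return log_joint(theta, X, y) + entropy

@jax.jit
def sgd_step(params, z, X, y, lr=1e-2):
    """One stochastic gradient ascent step on the one-sample ELBO estimate."""
    elbo, grads = jax.value_and_grad(elbo_estimate)(params, z, X, y)
    return {k: params[k] + lr * grads[k] for k in params}, elbo

# Simulated data and a simple optimization loop.
key = jax.random.PRNGKey(0)
key, kx, ky = jax.random.split(key, 3)
X = jax.random.normal(kx, (100, d))
y = (jax.random.uniform(ky, (100,)) < jax.nn.sigmoid(X @ jnp.ones(d))).astype(jnp.float32)

params = {"mu": jnp.zeros(d), "W": jnp.zeros((d, d)), "log_diag": jnp.zeros(d)}
for step in range(2000):
    key, kz = jax.random.split(key)
    z = jax.random.normal(kz, (d,))
    params, elbo = sgd_step(params, z, X, y)
print("variational mean:", params["mu"])
```

Because conditional independence in the model corresponds to zeros in the precision matrix, restricting T to a fixed sparsity pattern keeps the number of variational parameters close to linear in the dimension for banded or block-sparse structures, which is what makes the approximation both flexible and parsimonious.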
Pages: 259 - 275
Number of pages: 16
Related Papers
50 records in total
  • [21] Torri G., Giacometti R., Paterlini S. Sparse precision matrices for minimum variance portfolios. Computational Management Science, 2019, 16(3): 375-400
  • [22] Brkljač B., Janev M., Obradović R., Rapaić D., Ralević N., Crnojević V. Sparse representation of precision matrices used in GMMs. Applied Intelligence, 2014, 41(3): 956-973
  • [25] Yang A., Li C., Rana S., Gupta S., Venkatesh S. Sparse Approximation for Gaussian Process with Derivative Observations. AI 2018: Advances in Artificial Intelligence, 2018, 11320: 507-518
  • [26] Tan L. S. L., Ong V. M. H., Nott D. J., Jasra A. Variational inference for sparse spectrum Gaussian process regression. Statistics and Computing, 2016, 26(6): 1243-1261
  • [27] Cao J., Kang M., Jimenez F., Sang H., Schafer F., Katzfuss M. Variational Sparse Inverse Cholesky Approximation for Latent Gaussian Processes via Double Kullback-Leibler Minimization. International Conference on Machine Learning, 2023, Vol. 202
  • [28] Lv D., Zhang X. A greedy algorithm for sparse precision matrix approximation. Journal of Computational Mathematics, 2021, 39(5): 655-669
  • [30] Forero M. G., Navarro A. F., Miranda S. L. Inpainting method based on variational calculus and sparse matrices. Applications of Digital Image Processing XLIV, 2021, 11842