Projective Integral Updates for High-Dimensional Variational Inference

Cited: 0
Authors
Duersch, Jed A. [1]
Affiliations
[1] Sandia National Laboratories, Livermore, CA 94550, USA
Keywords
variational inference; Gaussian mean-field; Hessian approximation; quasi-Newton; spike-and-slab; quadrature; cubature; Hadamard basis
DOI
10.1137/22M1529919
Chinese Library Classification
O1 [Mathematics]
Subject Classification Codes
0701; 070101
Abstract
Variational inference is an approximation framework for Bayesian inference that seeks to improve the quantified uncertainty in predictions by optimizing a simplified distribution over parameters to stand in for the full posterior. Capturing model variations that remain consistent with the training data enables more robust predictions by reducing parameter sensitivity. This work introduces a fixed-point optimization for variational inference that is applicable whenever every feasible log density can be expressed as a linear combination of functions from a given basis. In such cases, the optimizer becomes a fixed point of projective integral updates. When the basis spans univariate quadratics in each parameter, the feasible distributions are Gaussian mean-fields and the projective integral updates yield quasi-Newton variational Bayes (QNVB); other bases and updates are also possible. Since these updates require high-dimensional integration, this work begins by proposing an efficient quasirandom sequence of quadratures for mean-field distributions. Each iterate of the sequence contains two evaluation points that combine to integrate all univariate quadratic functions correctly and, if the mean-field factors are symmetric, all univariate cubics. More importantly, averaging results over short subsequences achieves periodic exactness on a much larger space of multivariate polynomials of quadratic total degree. The corresponding variational updates require four loss evaluations with standard (not second-order) backpropagation to eliminate error terms from over half of all multivariate quadratic basis functions. This integration technique is motivated by first proposing stochastic blocked mean-field quadratures, which may be useful in other contexts. A PyTorch implementation of QNVB allows better control over model uncertainty during training than competing methods, and experiments demonstrate superior generalizability across multiple learning problems and architectures.
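
The two-point, sign-vector quadrature described in the abstract can be made concrete. Below is a minimal NumPy sketch, not the paper's QNVB implementation; the dimension, the test polynomial, and the names (`mu`, `sigma`, `f`, `pair_mean`) are illustrative assumptions. It checks that, for a Gaussian mean-field with means `mu` and standard deviations `sigma`, averaging the evaluation pair f(mu ± sigma·s) over the columns s of a Hadamard matrix integrates a polynomial of quadratic total degree exactly, matching the periodic-exactness claim above.

```python
import numpy as np
from scipy.linalg import hadamard

rng = np.random.default_rng(0)
d = 8                                    # dimension; hadamard() needs a power of two
mu = rng.normal(size=d)                  # mean-field means
sigma = rng.uniform(0.5, 1.5, size=d)    # mean-field standard deviations

# Arbitrary polynomial of quadratic total degree: f(x) = x'Ax + b'x
A = rng.normal(size=(d, d))
A = 0.5 * (A + A.T)
b = rng.normal(size=d)

def f(x):
    return x @ A @ x + b @ x

# Closed-form mean-field expectation:
# E[f] = mu'A mu + b'mu + sum_i A_ii sigma_i^2
exact = f(mu) + np.sum(np.diag(A) * sigma**2)

# Each sign vector s in {±1}^d gives the evaluation pair mu ± sigma*s.
def pair_mean(s):
    return 0.5 * (f(mu + sigma * s) + f(mu - sigma * s))

# Averaging over the mutually orthogonal Hadamard columns cancels every
# cross term s_i s_j with i != j, leaving only the diagonal contributions.
H = hadamard(d)
estimate = np.mean([pair_mean(s) for s in H.T])

assert np.isclose(estimate, exact)       # exact for quadratic total degree
```

The pair average alone is already exact on univariate quadratics (and, by symmetry of the factors, univariate cubics); the orthogonality of the Hadamard sign columns is what cancels the cross terms (x_i - mu_i)(x_j - mu_j) and extends exactness to the full space of quadratic total degree.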
Pages: 69-100
Page count: 32
Related Papers
50 records in total
  • [1] Implicit Variational Inference for High-Dimensional Posteriors
    Uppal, Anshuk
    Stensbo-Smidt, Kristoffer
    Boomsma, Wouter
    Frellsen, Jes
    Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023
  • [2] Variational Inference in High-Dimensional Linear Regression
    Mukherjee, Sumit
    Sen, Subhabrata
    Journal of Machine Learning Research, 2022, 23
  • [3] Challenges and Opportunities in High-Dimensional Variational Inference
    Dhaka, Akash Kumar
    Catalina, Alejandro
    Welandawe, Manushi
    Andersen, Michael Riis
    Huggins, Jonathan H.
    Vehtari, Aki
    Advances in Neural Information Processing Systems 34 (NeurIPS 2021), 2021, 34
  • [4] Projective Inference in High-Dimensional Problems: Prediction and Feature Selection
    Piironen, Juho
    Paasiniemi, Markus
    Vehtari, Aki
    Electronic Journal of Statistics, 2020, 14 (1): 2155-2197
  • [5] Variational Bayesian Inference in High-Dimensional Linear Mixed Models
    Yi, Jieyi
    Tang, Niansheng
    Mathematics, 2022, 10 (3)
  • [6] Online Variational Bayes Inference for High-Dimensional Correlated Data
    Kabisa, Sylvie
    Dunson, David B.
    Morris, Jeffrey S.
    Journal of Computational and Graphical Statistics, 2016, 25 (2): 426-444
  • [7] Variational Inference and Sparsity in High-Dimensional Deep Gaussian Mixture Models
    Kock, Lucas
    Klein, Nadja
    Nott, David J.
    Statistics and Computing, 2022, 32 (5)
  • [8] Stabilizing Training of Affine Coupling Layers for High-Dimensional Variational Inference
    Andrade, Daniel
    Machine Learning: Science and Technology, 2024, 5 (4)