Challenges and Opportunities in High-dimensional Variational Inference

Cited: 0
|
Authors
Dhaka, Akash Kumar [1 ,2 ]
Catalina, Alejandro [1 ]
Welandawe, Manushi [3 ]
Andersen, Michael Riis [4 ]
Huggins, Jonathan H. [3 ]
Vehtari, Aki [1 ]
Affiliations
[1] Aalto Univ, Espoo, Finland
[2] Silo AI, Helsinki, Finland
[3] Boston Univ, Boston, MA 02215 USA
[4] Tech Univ Denmark, Lyngby, Denmark
Funding
Academy of Finland;
Keywords
APPROXIMATION;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Current black-box variational inference (BBVI) methods require the user to make numerous design choices, such as the selection of the variational objective and the approximating family, yet there is little principled guidance on how to do so. We develop a conceptual framework and a set of experimental tools to understand the effects of these choices, which we leverage to propose best practices for maximizing posterior approximation accuracy. Our approach is based on studying the pre-asymptotic tail behavior of the density ratios between the joint distribution and the variational approximation, then exploiting insights and tools from the importance sampling literature. Our framework and supporting experiments help to distinguish between the behavior of BBVI methods for approximating low-dimensional versus moderate-to-high-dimensional posteriors. In the latter case, we show that mass-covering variational objectives are difficult to optimize and do not improve accuracy, but flexible variational families can improve accuracy and the effectiveness of importance sampling, at the cost of additional optimization challenges. Therefore, for moderate-to-high-dimensional posteriors we recommend using the (mode-seeking) exclusive KL divergence, since it is the easiest to optimize, and improving the variational family or using model parameter transformations to make the posterior and the optimal variational approximation more similar. In low-dimensional settings, by contrast, we show that heavy-tailed variational families and mass-covering divergences are effective and can increase the chances that the approximation can be improved by importance sampling.
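The sketch below is not the authors' code; it is a minimal illustration, in plain NumPy, of the recipe the abstract recommends for moderate-to-high-dimensional posteriors: fit a mean-field Gaussian by maximizing the ELBO (equivalently, minimizing the exclusive KL divergence) with reparameterization gradients, then check the fit via the density ratios between the joint and the approximation. The toy Gaussian target, step size, and sample counts are illustrative assumptions; the paper's actual tail diagnostic is the Pareto-k-hat from Pareto-smoothed importance sampling, for which the relative effective sample size computed here is only a crude proxy.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10  # a moderate-dimensional toy posterior (assumed for illustration)

# Toy target: independent Gaussian "posterior" standing in for the log joint.
true_mu = np.linspace(-1.0, 1.0, D)
true_sd = np.linspace(0.5, 2.0, D)

def log_p(z):
    return -0.5 * np.sum(((z - true_mu) / true_sd) ** 2
                         + np.log(2.0 * np.pi * true_sd ** 2), axis=-1)

# Mean-field Gaussian approximation q(z) = N(mu, diag(exp(log_sd))^2), fit by
# stochastic gradient ascent on the ELBO (the exclusive-KL objective) using
# the reparameterization trick z = mu + sd * eps.
mu, log_sd = np.zeros(D), np.zeros(D)
lr, n_mc = 0.05, 32
for _ in range(2000):
    sd = np.exp(log_sd)
    eps = rng.standard_normal((n_mc, D))
    z = mu + sd * eps
    score = -(z - true_mu) / true_sd ** 2               # d log p / d z for this target
    grad_mu = score.mean(axis=0)                        # reparameterization gradient
    grad_log_sd = (score * eps * sd).mean(axis=0) + 1.0 # +1 from the entropy term
    mu += lr * grad_mu
    log_sd += lr * grad_log_sd

# Diagnose the fit via the density ratio r = p/q, as in the paper's framework:
# heavy right tails of log r signal a poor approximation. The relative
# importance-sampling ESS below is a rough proxy for the PSIS k-hat diagnostic.
S = 4000
z = mu + np.exp(log_sd) * rng.standard_normal((S, D))
log_q = -0.5 * np.sum(((z - mu) / np.exp(log_sd)) ** 2
                      + np.log(2.0 * np.pi) + 2.0 * log_sd, axis=-1)
log_r = log_p(z) - log_q
w = np.exp(log_r - log_r.max())
rel_ess = w.sum() ** 2 / ((w ** 2).sum() * S)
print(f"relative importance-sampling ESS: {rel_ess:.3f}")  # near 1 => good fit
```

In the low-dimensional regime the abstract describes, the same ratio check can instead motivate a heavier-tailed variational family or a mass-covering divergence, which improves the prospects of refining the approximation by importance sampling.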
Pages: 12