A Contrastive Divergence for Combining Variational Inference and MCMC

Cited by: 0
Authors:
Ruiz, Francisco J. R. [1,2]
Titsias, Michalis K. [3]
Affiliations:
[1] Univ Cambridge, Cambridge, England
[2] Columbia Univ, New York, NY 10027 USA
[3] DeepMind, London, England
DOI: not available
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract:
We develop a method to combine Markov chain Monte Carlo (MCMC) and variational inference (VI), leveraging the advantages of both inference approaches. Specifically, we improve the variational distribution by running a few MCMC steps. To make inference tractable, we introduce the variational contrastive divergence (VCD), a new divergence that replaces the standard Kullback-Leibler (KL) divergence used in VI. The VCD captures a notion of discrepancy between the initial variational distribution and its improved version (obtained after running the MCMC steps), and it converges asymptotically to the symmetrized KL divergence between the variational distribution and the posterior of interest. The VCD objective can be optimized efficiently with respect to the variational parameters via stochastic optimization. We show experimentally that optimizing the VCD leads to better predictive performance on two latent variable models: logistic matrix factorization and variational autoencoders (VAEs).
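The objective the abstract describes can be illustrated with a small toy sketch (this is not the authors' code; the target model, parameter values, and function names are hypothetical). One tractable reading of the VCD, consistent with the abstract, is the difference of two expectations, E_{q_λ}[log q_λ(z) − log p(x, z)] − E_{q_λ^{(t)}}[log q_λ(z) − log p(x, z)], where q_λ^{(t)} is the distribution of samples from q_λ after t MCMC steps targeting the posterior; note the intractable log-density of the improved distribution never needs to be evaluated:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_joint(z):
    # Toy unnormalized log-joint log p(x, z): posterior is N(z; 1, 0.5^2)
    # up to an additive constant (hypothetical example, not from the paper).
    return -0.5 * ((z - 1.0) / 0.5) ** 2

def log_q(z, mu, log_sigma):
    # Log-density of the Gaussian variational distribution q_lambda(z).
    sigma = np.exp(log_sigma)
    return -0.5 * ((z - mu) / sigma) ** 2 - log_sigma - 0.5 * np.log(2 * np.pi)

def mcmc_improve(z, t=5, step=0.5):
    # Run t random-walk Metropolis steps targeting the posterior p(z | x).
    for _ in range(t):
        prop = z + step * rng.standard_normal(z.shape)
        accept = np.log(rng.uniform(size=z.shape)) < log_joint(prop) - log_joint(z)
        z = np.where(accept, prop, z)
    return z

def vcd_estimate(mu, log_sigma, n=5000, t=5):
    # Sample z ~ q_lambda, then improve the samples with a few MCMC steps.
    z0 = mu + np.exp(log_sigma) * rng.standard_normal(n)
    zt = mcmc_improve(z0, t=t)
    # VCD-style objective: E_q[log q - log p(x,z)] - E_{q^t}[log q - log p(x,z)].
    # The log-density of the improved distribution q^t cancels out.
    term_q = np.mean(log_q(z0, mu, log_sigma) - log_joint(z0))
    term_qt = np.mean(log_q(zt, mu, log_sigma) - log_joint(zt))
    return term_q - term_qt
```

When q_λ already matches the posterior (here mu = 1, sigma = 0.5), the MCMC steps leave the sample distribution invariant and the estimate is near zero; for a badly placed q_λ (e.g. mu = −2) the improved samples move toward the posterior and the estimate is large and positive, which is the discrepancy the divergence is meant to capture.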
Pages: 9