A Modified Stein Variational Inference Algorithm with Bayesian and Gradient Descent Techniques

Cited: 0
Authors
Zhang, Limin [1 ]
Dong, Jing [2 ]
Zhang, Junfang [1 ]
Yang, Junzi [1 ]
Affiliations
[1] Hengshui Univ, Dept Math & Comp Sci, Hengshui 053000, Peoples R China
[2] North China Univ Sci & Technol, Coll Sci, Tangshan 063210, Peoples R China
Source
SYMMETRY-BASEL | 2022, Vol. 14, No. 06
Keywords
Stein method; Bayesian variational inference; KL divergence; Bayesian logistic regression; MODEL
DOI: 10.3390/sym14061188
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Discipline Classification Codes
07; 0710; 09
Abstract
This paper introduces a novel variational inference (VI) method that combines Bayesian and gradient descent techniques. In recent years, the Stein method has been used in Bayesian variational inference algorithms to facilitate the approximation of the posterior distributions of model parameters. Unfortunately, previous methods fail to explicitly describe the influence of a particle's history on its trajectory (denoted Q(x) in this paper) during the approximation, even though this information is important for guiding the particle search. To address this issue, a modified Stein variational inference algorithm is proposed in which Q(x) is considered in the design of the operator B_p, so that the chance of jumping out of local optima is increased, especially in the case of complex distributions; this makes the gradient descent of the Kullback-Leibler (KL) divergence more random. In our method, a group of particles is used to approximate the target distribution by minimizing the KL divergence, which changes according to a newly defined kernelized Stein discrepancy. The usefulness of the proposed technique is demonstrated on four data sets, with Bayesian logistic regression used for classification. Statistical measures such as parameter estimates, classification accuracy, F1, NRMSE, and others are used to validate the algorithm's performance.
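For intuition, the following is a minimal sketch of the standard Stein variational gradient descent (SVGD) update of Liu and Wang (2016), on which this family of methods builds: a set of particles is transported along the kernelized Stein direction so that the KL divergence to the target shrinks. The paper's modified operator B_p and history term Q(x) are not reproduced here; the RBF kernel with median-heuristic bandwidth and the toy 1-D Gaussian-mixture target are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

def score(x):
    """Gradient of log p(x) for a toy 1-D mixture p(x) = 0.5*N(-2,1) + 0.5*N(2,1)."""
    w = np.exp(-0.5 * (x + 2.0) ** 2)
    v = np.exp(-0.5 * (x - 2.0) ** 2)
    return (-(x + 2.0) * w - (x - 2.0) * v) / (w + v)

def svgd_step(x, eps=0.05):
    """One SVGD update: x_i += eps/n * sum_j [k(x_j,x_i)*score(x_j) + dk(x_j,x_i)/dx_j]."""
    n = x.shape[0]
    diffs = x[:, None] - x[None, :]            # diffs[i, j] = x_i - x_j
    h2 = np.median(diffs ** 2)                 # median heuristic for the bandwidth
    h2 = 0.5 * h2 / np.log(n + 1.0) + 1e-8
    K = np.exp(-diffs ** 2 / (2.0 * h2))       # RBF kernel matrix k(x_j, x_i)
    grad_k = (K * diffs).sum(axis=1) / h2      # sum_j dk(x_j, x_i)/dx_j (repulsive term)
    phi = (K @ score(x) + grad_k) / n          # kernelized Stein direction
    return x + eps * phi

rng = np.random.default_rng(0)
particles = rng.normal(size=50)                # 50 particles initialized from N(0, 1)
for _ in range(1000):
    particles = svgd_step(particles)
print(particles.mean(), particles.std())       # particles end up spread over both modes
```

The repulsive term grad_k is what keeps the particles from collapsing onto a single mode; the paper's modification injects additional randomness into this KL descent, which, per the abstract, helps escape local optima for complex target distributions.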
Pages: 11
Related Papers (50 in total; items 21-30 shown)
  • [21] Non-Gaussian Parameter Inference for Hydrogeological Models Using Stein Variational Gradient Descent
    Ramgraber, Maximilian
    Weatherl, Robin
    Blumensaat, Frank
    Schirmer, Mario
    WATER RESOURCES RESEARCH, 2021, 57 (04)
  • [22] VAE Learning via Stein Variational Gradient Descent
    Pu, Yunchen
    Gan, Zhe
    Henao, Ricardo
    Li, Chunyuan
    Han, Shaobo
    Carin, Lawrence
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017), 2017, 30
  • [23] Variational Bayesian Inference Techniques
    Seeger, Matthias W.
    Wipf, David P.
    IEEE SIGNAL PROCESSING MAGAZINE, 2010, 27 (06) : 81 - 91
  • [24] Gradient-Free Stein Variational Gradient Descent with Kernel Approximation
    Yan, Liang
    Zou, Xiling
    APPLIED MATHEMATICS LETTERS, 2021, 121 (121)
  • [25] Stochastic Gradient Descent as Approximate Bayesian Inference
    Mandt, Stephan
    Hoffman, Matthew D.
    Blei, David M.
    JOURNAL OF MACHINE LEARNING RESEARCH, 2017, 18
  • [27] A Stochastic Version of Stein Variational Gradient Descent for Efficient Sampling
    Li, Lei
    Li, Yingzhou
    Liu, Jian-Guo
    Liu, Zibu
    Lu, Jianfeng
    COMMUNICATIONS IN APPLIED MATHEMATICS AND COMPUTATIONAL SCIENCE, 2020, 15 (01) : 37 - 63
  • [28] Learning to Draw Samples with Amortized Stein Variational Gradient Descent
    Feng, Yihao
    Wang, Dilin
    Liu, Qiang
    CONFERENCE ON UNCERTAINTY IN ARTIFICIAL INTELLIGENCE (UAI 2017), 2017
  • [29] Density Estimation-Based Stein Variational Gradient Descent
    Kim, Jeongho
    Lee, Byungjoon
    Min, Chohong
    Park, Jaewoo
    Ryu, Keunkwan
    COGNITIVE COMPUTATION, 2025, 17 (01)
  • [30] Stein Variational Gradient Descent with Matrix-Valued Kernels
    Wang, Dilin
    Tang, Ziyang
    Bajaj, Chandrajit
    Liu, Qiang
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32