A Modified Stein Variational Inference Algorithm with Bayesian and Gradient Descent Techniques

Times Cited: 0
Authors
Zhang, Limin [1 ]
Dong, Jing [2 ]
Zhang, Junfang [1 ]
Yang, Junzi [1 ]
Affiliations
[1] Hengshui Univ, Dept Math & Comp Sci, Hengshui 053000, Peoples R China
[2] North China Univ Sci & Technol, Coll Sci, Tangshan 063210, Peoples R China
Source
SYMMETRY-BASEL | 2022, Vol. 14, No. 06
Keywords
Stein method; Bayesian variational inference; KL divergence; Bayesian logistic regression; MODEL;
DOI
10.3390/sym14061188
Chinese Library Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy, Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Subject Classification Codes
07 ; 0710 ; 09 ;
Abstract
This paper introduces a novel variational inference (VI) method combining Bayesian and gradient descent techniques. To facilitate the approximation of the posterior distributions of model parameters, the Stein method has been used in Bayesian variational inference algorithms in recent years. Unfortunately, previous methods fail to explicitly describe the influence of a particle's history on its trajectory (denoted Q(x) in this paper) during the approximation, which is important information in the search for particles. In our method, Q(x) is considered in the design of the operator Bp, which may increase the chance of jumping out of local optima, especially in the case of complex distributions. To address these issues, a modified Stein variational inference algorithm is proposed, which makes the gradient descent of the Kullback-Leibler (KL) divergence more stochastic. In our method, a group of particles approximates the target distribution by minimizing the KL divergence, with updates driven by a newly defined kernelized Stein discrepancy. The usefulness of the proposed technique is demonstrated on four data sets, with Bayesian logistic regression used for classification. Statistical measures such as parameter estimates, classification accuracy, F1, NRMSE, and others are used to validate the algorithm's performance.
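The abstract describes a particle-based scheme in the family of Stein variational gradient descent (SVGD): a set of particles is transported to minimize the KL divergence to the target, with the descent direction given by a kernelized Stein discrepancy. The paper's modified operator Bp and history term Q(x) are not specified in this record, so the sketch below shows only the standard SVGD update (Liu and Wang's formulation, listed as related paper [1]) on a toy 1-D Gaussian target; the function names and the target are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, h=None):
    """RBF kernel matrix and its gradient w.r.t. the first argument."""
    diff = X[:, None, :] - X[None, :, :]           # diff[i, j] = x_i - x_j
    sq = np.sum(diff ** 2, axis=-1)                # pairwise squared distances
    if h is None:                                  # median-heuristic bandwidth
        h = np.median(sq) / np.log(X.shape[0] + 1) + 1e-8
    K = np.exp(-sq / h)
    dK = (-2.0 / h) * diff * K[:, :, None]         # dK[i, j] = grad_{x_i} k(x_i, x_j)
    return K, dK

def svgd(X, grad_logp, n_iter=500, step=0.1):
    """Standard SVGD: move particles X toward the target distribution."""
    X = X.copy()
    n = X.shape[0]
    for _ in range(n_iter):
        K, dK = rbf_kernel(X)
        # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
        # first term drives particles toward high density; second term repels them apart
        phi = (K @ grad_logp(X) + dK.sum(axis=0)) / n
        X = X + step * phi
    return X

# Toy target: 1-D Gaussian N(3, 1), so grad log p(x) = -(x - 3)
rng = np.random.default_rng(0)
particles = rng.normal(0.0, 1.0, size=(50, 1))
out = svgd(particles, lambda X: -(X - 3.0))
```

After the updates, the particle cloud should approximate the target: its mean drifts toward 3 and the kernel repulsion keeps the particles spread out rather than collapsing to the mode.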
Pages: 11
Related Papers (50 total)
  • [1] Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm
    Liu, Qiang
    Wang, Dilin
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 29 (NIPS 2016), 2016, 29
  • [2] Riemannian Stein Variational Gradient Descent for Bayesian Inference
    Liu, Chang
    Zhu, Jun
    THIRTY-SECOND AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTIETH INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / EIGHTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2018, : 3627 - 3634
  • [3] Quantile Stein Variational Gradient Descent for Batch Bayesian Optimization
    Gong, Chengyue
    Peng, Jian
    Liu, Qiang
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [4] Further analysis of multilevel Stein variational gradient descent with an application to the Bayesian inference of glacier ice models
    Alsup, Terrence
    Hartland, Tucker
    Peherstorfer, Benjamin
    Petra, Noemi
    ADVANCES IN COMPUTATIONAL MATHEMATICS, 2024, 50 (04)
  • [5] Multilevel Stein variational gradient descent with applications to Bayesian inverse problems
    Alsup, Terrence
    Venturi, Luca
    Peherstorfer, Benjamin
    MATHEMATICAL AND SCIENTIFIC MACHINE LEARNING, VOL 145, 2021, 145 : 93 - +
  • [6] Stein Variational Gradient Descent as Gradient Flow
    Liu, Qiang
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017), 2017, 30
  • [7] Stein Variational Gradient Descent Without Gradient
    Han, Jun
    Liu, Qiang
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 80, 2018, 80
  • [8] On the geometry of Stein variational gradient descent
    Duncan, A.
    Nusken, N.
    Szpruch, L.
    JOURNAL OF MACHINE LEARNING RESEARCH, 2023, 24
  • [9] Projected Stein Variational Gradient Descent
    Chen, Peng
    Ghattas, Omar
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [10] Grassmann Stein Variational Gradient Descent
    Liu, Xing
    Zhu, Harrison
    Ton, Jean-Francois
    Wynne, George
    Duncan, Andrew
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151, 2022, 151