Sparse Bayesian Hierarchical Mixture of Experts and Variational Inference

Cited by: 0
Authors
Iikubo, Yuji [1 ]
Horii, Shunsuke [2 ]
Matsushima, Toshiyasu [3 ]
Affiliations
[1] Waseda Univ, Res Inst Sci & Engn, Shinjuku Ku, 3-4-1 Okubo, Tokyo 1698555, Japan
[2] Waseda Univ, Global Educ Ctr, Shinjuku Ku, 1-6-1 Nishiwaseda, Tokyo 1698050, Japan
[3] Waseda Univ, Dept Appl Math, Shinjuku Ku, 3-4-1 Okubo, Tokyo 1698555, Japan
Keywords
DOI
Not available
CLC number
TP [Automation technology; computer technology];
Discipline classification code
0812;
Abstract
The hierarchical mixture of experts (HME) is a tree-structured probabilistic model for regression and classification. The HME has considerable expressive capability; however, parameter estimation tends to overfit because of the complexity of the model. To avoid this problem, regularization techniques are widely used. In particular, it is known that a sparse solution can be obtained by L1 regularization. From a Bayesian point of view, regularization techniques are equivalent to assuming that the parameters follow prior distributions and finding the maximum a posteriori (MAP) estimator. In particular, L1 regularization is equivalent to assuming Laplace prior distributions. However, the posterior distribution is difficult to compute when Laplace priors are assumed directly. In this paper, we assume that the parameters of the HME follow hierarchical prior distributions that are equivalent to Laplace distributions, which promotes sparse solutions. We propose a Bayesian estimation algorithm based on the variational method. Finally, the proposed algorithm is evaluated by computer simulations.
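The hierarchical prior mentioned in the abstract presumably builds on the standard scale-mixture representation of the Laplace distribution (as used in the Bayesian lasso); the sketch below states that identity with a generic weight w and rate parameter lambda, which are illustrative notation rather than the paper's own:

% Laplace density as a scale mixture of Gaussians:
%   w | s ~ N(0, s),   s ~ Exp(rate = lambda^2 / 2)
% Marginalizing the latent variance s recovers the Laplace prior,
% so MAP estimation under this prior corresponds to L1 regularization.
\[
  \frac{\lambda}{2}\, e^{-\lambda |w|}
  \;=\;
  \int_{0}^{\infty} \mathcal{N}\!\left(w \mid 0,\, s\right)\,
  \frac{\lambda^{2}}{2}\, e^{-\lambda^{2} s / 2}\, \mathrm{d}s .
\]

Under this representation the conditional distributions of w given s (Gaussian) and of 1/s given w (inverse Gaussian) are tractable, which is presumably what makes the variational updates in the proposed algorithm feasible; the paper's exact parameterization may differ.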
Pages: 60-64
Number of pages: 5