Sparse Bayesian Hierarchical Mixture of Experts and Variational Inference

Cited by: 0

Authors
Iikubo, Yuji [1 ]
Horii, Shunsuke [2 ]
Matsushima, Toshiyasu [3 ]
Affiliations
[1] Waseda Univ, Res Inst Sci & Engn, Shinjuku Ku, 3-4-1 Okubo, Tokyo 1698555, Japan
[2] Waseda Univ, Global Educ Ctr, Shinjuku Ku, 1-6-1 Nishiwaseda, Tokyo 1698050, Japan
[3] Waseda Univ, Dept Appl Math, Shinjuku Ku, 3-4-1 Okubo, Tokyo 1698555, Japan
Keywords
DOI
None available
CLC classification
TP [automation technology, computer technology];
Discipline classification code
0812 ;
Abstract
The hierarchical mixture of experts (HME) is a tree-structured probabilistic model for regression and classification. The HME has considerable expressive capability; however, parameter estimation tends to overfit due to the complexity of the model. Regularization techniques are widely used to avoid this problem. In particular, it is known that a sparse solution can be obtained by L1 regularization. From a Bayesian point of view, regularization is equivalent to assuming that the parameters follow prior distributions and finding the maximum a posteriori (MAP) estimator; L1 regularization corresponds to assuming Laplace prior distributions. However, the posterior distribution is difficult to compute when Laplace priors are assumed. In this paper, we assume that the parameters of the HME follow hierarchical prior distributions that are equivalent to Laplace distributions, thereby promoting sparse solutions. We propose a Bayesian estimation algorithm based on the variational method. Finally, the proposed algorithm is evaluated by computer simulations.
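The abstract's key device, replacing an intractable Laplace prior with an equivalent hierarchical prior, rests on the standard fact that a Laplace distribution is a scale mixture of Gaussians with an exponential mixing density on the variance. The sketch below (an illustration of that equivalence, not the paper's estimation algorithm; the rate parameter `lam` is chosen arbitrarily) samples a variance from an exponential hyperprior and a weight from a zero-mean Gaussian with that variance; marginally the weight then follows a Laplace distribution with scale b = 1/sqrt(2*lam).

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 0.5        # rate of the exponential hyperprior on the variance (illustrative choice)
n = 200_000

# Hierarchical prior: tau ~ Exponential(rate=lam), then w | tau ~ N(0, tau)
tau = rng.exponential(scale=1.0 / lam, size=n)
w = rng.normal(loc=0.0, scale=np.sqrt(tau))

# Marginally w ~ Laplace(0, b) with b = 1/sqrt(2*lam) = 1 here, so E|w| = b = 1
print(np.mean(np.abs(w)))
```

Because each conditional in the hierarchy is Gaussian or exponential, conjugate variational updates become tractable, which is precisely what a Laplace prior taken directly does not allow.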
Pages: 60-64
Number of pages: 5
Related papers
50 items total
  • [1] Model Selection of Bayesian Hierarchical Mixture of Experts based on Variational Inference
    Iikubo, Yuji
    Horii, Shunsuke
    Matsushima, Toshiyasu
    2019 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS (SMC), 2019, : 3474 - 3479
  • [2] SPARSE BAYESIAN HIERARCHICAL MIXTURE OF EXPERTS
    Mossavat, Iman
    Amft, Oliver
    2011 IEEE STATISTICAL SIGNAL PROCESSING WORKSHOP (SSP), 2011, : 653 - 656
  • [3] Hierarchical Sparse Signal Recovery by Variational Bayesian Inference
    Wang, Lu
    Zhao, Lifan
    Bi, Guoan
    Wan, Chunru
    IEEE SIGNAL PROCESSING LETTERS, 2014, 21 (01) : 110 - 113
  • [4] Optimal model inference for Bayesian mixture of experts
    Ueda, N
    Ghahramani, Z
    NEURAL NETWORKS FOR SIGNAL PROCESSING X, VOLS 1 AND 2, PROCEEDINGS, 2000, : 145 - 154
  • [5] Optimal model inference for Bayesian mixture of experts
    Ueda, Naonori
    Ghahramani, Zoubin
    Neural Networks for Signal Processing - Proceedings of the IEEE Workshop, 2000, 1 : 145 - 154
  • [6] Identification of MISO Hammerstein system using sparse multiple kernel-based hierarchical mixture prior and variational Bayesian inference
    Chen, Xiaolong
    Chai, Yi
    Liu, Qie
    Huang, Pengfei
    Fan, Linchuan
    ISA TRANSACTIONS, 2023, 137 : 323 - 338
  • [7] Sparse Audio Inpainting with Variational Bayesian Inference
    Chantas, Giannis
    Nikolopoulos, Spiros
    Kompatsiaris, Ioannis
    2018 IEEE INTERNATIONAL CONFERENCE ON CONSUMER ELECTRONICS (ICCE), 2018,
  • [8] Bayesian Group-Sparse Modeling and Variational Inference
    Babacan, S. Derin
    Nakajima, Shinichi
    Do, Minh N.
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2014, 62 (11) : 2906 - 2921
  • [9] Sparse Variational Inference: Bayesian Coresets from Scratch
    Campbell, Trevor
    Beronov, Boyan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [10] Bayesian estimation of Dirichlet mixture model with variational inference
    Ma, Zhanyu
    Rana, Pravin Kumar
    Taghia, Jalil
    Flierl, Markus
    Leijon, Arne
    PATTERN RECOGNITION, 2014, 47 (09) : 3143 - 3157