Sparse Bayesian Hierarchical Mixture of Experts and Variational Inference

Cited by: 0
Authors
Iikubo, Yuji [1]
Horii, Shunsuke [2]
Matsushima, Toshiyasu [3]
Affiliations
[1] Waseda Univ, Res Inst Sci & Engn, Shinjuku Ku, 3-4-1 Okubo, Tokyo 1698555, Japan
[2] Waseda Univ, Global Educ Ctr, Shinjuku Ku, 1-6-1 Nishiwaseda, Tokyo 1698050, Japan
[3] Waseda Univ, Dept Appl Math, Shinjuku Ku, 3-4-1 Okubo, Tokyo 1698555, Japan
Keywords: (none listed)
DOI: not available
Chinese Library Classification (CLC): TP [automation technology; computer technology]
Discipline code: 0812
Abstract
The hierarchical mixture of experts (HME) is a tree-structured probabilistic model for regression and classification. The HME has considerable expressive capability; however, parameter estimation tends to overfit because of the model's complexity. To avoid this problem, regularization techniques are widely used. In particular, it is known that a sparse solution can be obtained by L1 regularization. From a Bayesian point of view, regularization is equivalent to assuming that the parameters follow prior distributions and finding the maximum a posteriori (MAP) estimator; L1 regularization in particular corresponds to assuming Laplace prior distributions. However, the posterior distribution is difficult to compute when Laplace priors are assumed directly. In this paper, we assume that the parameters of the HME follow hierarchical prior distributions that are equivalent to Laplace distributions, so as to promote sparse solutions. We propose a Bayesian estimation algorithm based on the variational method. Finally, the proposed algorithm is evaluated by computer simulations.
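The hierarchical prior the abstract refers to is, by the standard scale-mixture result, equivalent to the Laplace prior behind L1 regularization: a Gaussian whose variance follows an exponential distribution is marginally Laplace. The sketch below is ours, not the authors' algorithm; it checks this equivalence by Monte Carlo, and the rate parameter lam and all variable names are illustrative assumptions.

```python
import numpy as np

# Illustrative check (not the paper's method): the Laplace prior
#   Laplace(w | 0, 1/lam) = (lam/2) * exp(-lam * |w|)
# equals the scale mixture  integral of  N(w | 0, tau) * Exp(tau | lam^2/2) dtau,
# i.e. a Gaussian whose variance tau is exponentially distributed.

rng = np.random.default_rng(0)
lam = 2.0          # assumed rate parameter, chosen arbitrarily for the demo
n = 200_000

# tau ~ Exponential with rate lam^2/2 (NumPy parameterizes by scale = 1/rate)
tau = rng.exponential(scale=2.0 / lam**2, size=n)
# w | tau ~ N(0, tau), with tau as the variance
w = rng.normal(loc=0.0, scale=np.sqrt(tau))

# Analytic Laplace density on a small grid
grid = np.linspace(-3.0, 3.0, 7)
laplace_pdf = 0.5 * lam * np.exp(-lam * np.abs(grid))

# Empirical density of the mixture samples, read off a normalized histogram
hist, edges = np.histogram(w, bins=200, range=(-3.0, 3.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
empirical = np.interp(grid, centers, hist)

# The two density columns should roughly agree up to Monte Carlo noise
print(np.column_stack([grid, laplace_pdf, empirical]))
```

The practical point of the two-level representation is that, with conditionally Gaussian parameters, every conditional stays in the exponential family, which is the usual reason such hierarchical priors admit tractable variational (mean-field) updates while the flat Laplace prior does not.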
Pages: 60-64
Number of pages: 5