Sparse Bayesian Hierarchical Mixture of Experts and Variational Inference

Cited by: 0
Authors
Iikubo, Yuji [1 ]
Horii, Shunsuke [2 ]
Matsushima, Toshiyasu [3 ]
Institutions
[1] Waseda Univ, Res Inst Sci & Engn, Shinjuku Ku, 3-4-1 Okubo, Tokyo 1698555, Japan
[2] Waseda Univ, Global Educ Ctr, Shinjuku Ku, 1-6-1 Nishiwaseda, Tokyo 1698050, Japan
[3] Waseda Univ, Dept Appl Math, Shinjuku Ku, 3-4-1 Okubo, Tokyo 1698555, Japan
Keywords
DOI: not available
CLC number: TP [Automation Technology; Computer Technology]
Subject classification code: 0812
Abstract
The hierarchical mixture of experts (HME) is a tree-structured probabilistic model for regression and classification. The HME has considerable expressive capability; however, parameter estimation tends to overfit due to the complexity of the model. To avoid this problem, regularization techniques are widely used. In particular, it is known that a sparse solution can be obtained by L1 regularization. From a Bayesian point of view, regularization is equivalent to assuming that the parameters follow prior distributions and finding the maximum a posteriori (MAP) estimator, and L1 regularization corresponds to assuming Laplace priors. However, the posterior distribution is difficult to compute when Laplace distributions are assumed directly. In this paper, we assume that the parameters of the HME follow hierarchical prior distributions that are equivalent to the Laplace distribution, so as to promote sparse solutions. We propose a Bayesian estimation algorithm based on the variational method. Finally, the proposed algorithm is evaluated by computer simulations.
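The hierarchical prior the abstract refers to is, in standard formulations such as the Bayesian lasso, a Gaussian scale mixture: a Laplace prior on a weight is recovered by giving it a zero-mean Gaussian prior whose variance itself follows an exponential distribution. A minimal numerical sketch of this equivalence (not the authors' code; all names are illustrative):

```python
import numpy as np

# Laplace(lam) as a Gaussian scale mixture (standard result, e.g. the Bayesian lasso):
#   tau2 ~ Exponential(rate = lam^2 / 2),   w | tau2 ~ N(0, tau2)
#   =>  marginally, w has density (lam/2) * exp(-lam * |w|).
# This hierarchy is what makes variational updates tractable: both conditionals
# are exponential-family, unlike the Laplace density itself.
rng = np.random.default_rng(0)
lam = 2.0
n = 200_000

tau2 = rng.exponential(scale=2.0 / lam**2, size=n)   # variance hyperparameter
w = rng.normal(0.0, np.sqrt(tau2))                   # conditionally Gaussian weight

# Sanity check against direct Laplace sampling: for Laplace(lam), E|w| = 1/lam.
w_direct = rng.laplace(scale=1.0 / lam, size=n)
print(np.mean(np.abs(w)), np.mean(np.abs(w_direct)))  # both should be close to 0.5
```

Running the hierarchy forward and sampling the Laplace directly give matching moments, which is the property the paper exploits when replacing the Laplace prior with its two-level Gaussian-exponential equivalent inside the variational algorithm.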
Pages: 60 - 64 (5 pages)
Related Papers
50 items in total
  • [41] Variational Bayesian Inference for Infinite Dirichlet Mixture Towards Accurate Data Categorization
    Lai, Yuping
    He, Wenda
    Ping, Yuan
    Qu, Jinshuai
    Zhang, Xiufeng
    WIRELESS PERSONAL COMMUNICATIONS, 2018, 102 (03) : 2307 - 2329
  • [42] Dirichlet process mixture model based nonparametric Bayesian modeling and variational inference
    Fei, Zhengshun
    Liu, Kangling
    Huang, Bingqiang
    Zheng, Yongping
    Xiang, Xinjian
    2019 CHINESE AUTOMATION CONGRESS (CAC2019), 2019, : 3048 - 3051
  • [44] Wideband DOA Estimation Utilizing a Hierarchical Prior Based on Variational Bayesian Inference
    Li, Ninghui
    Zhang, Xiaokuan
    Zong, Binfeng
    Lv, Fan
    Xu, Jiahua
    Wang, Zhaolong
    ELECTRONICS, 2023, 12 (14)
  • [45] Forecasting using variational Bayesian inference in large vector autoregressions with hierarchical shrinkage
    Gefang, Deborah
    Koop, Gary
    Poon, Aubrey
    INTERNATIONAL JOURNAL OF FORECASTING, 2023, 39 (01) : 346 - 363
  • [46] Hierarchical Bayesian inference for Ill-posed problems via variational method
    Jin, Bangti
    Zou, Jun
    JOURNAL OF COMPUTATIONAL PHYSICS, 2010, 229 (19) : 7317 - 7343
  • [47] An analytically tractable solution for hierarchical Bayesian model updating with variational inference scheme
    Jia, Xinyu
    Yan, Wang-Ji
    Papadimitriou, Costas
    Yuen, Ka-Veng
    MECHANICAL SYSTEMS AND SIGNAL PROCESSING, 2023, 189
  • [48] CATVI: Conditional and Adaptively Truncated Variational Inference for Hierarchical Bayesian Nonparametric Models
    Liu, Yirui
    Qiao, Xinghao
    Lam, Jessica
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151, 2022, 151
  • [49] Variational Bayesian Inference Techniques
    Seeger, Matthias W.
    Wipf, David P.
    IEEE SIGNAL PROCESSING MAGAZINE, 2010, 27 (06) : 81 - 91
  • [50] A tutorial on variational Bayesian inference
    Fox, Charles W.
    Roberts, Stephen J.
    ARTIFICIAL INTELLIGENCE REVIEW, 2012, 38 (02) : 85 - 95