Bayesian model search for mixture models based on optimizing variational bounds

Cited by: 89
Authors
Ueda, N
Ghahramani, Z
Affiliations
[1] NTT Corp, Commun Sci Lab, Kyoto 6190237, Japan
[2] UCL, Gatsby Computat Neurosci Unit, London WC1N 3AR, England
Keywords
variational Bayes; Bayesian model search; mixture models; mixture of experts models; EM algorithm
DOI
10.1016/S0893-6080(02)00040-0
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
When learning a mixture model, we face two problems: local optima and determination of the model structure. In this paper, we present a method for solving both problems simultaneously within the variational Bayesian (VB) framework. First, within the VB framework, we derive an objective function that can simultaneously optimize both the model parameter distributions and the model structure. Next, focusing on mixture models, we present a deterministic algorithm that approximately optimizes this objective function using the split and merge operations we previously proposed within the maximum likelihood framework. Finally, we apply the method to mixture of experts (MoE) models and show experimentally that the proposed method can find the optimal number of experts of an MoE while avoiding local maxima. (C) 2002 Elsevier Science Ltd. All rights reserved.
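The objective function referred to in the abstract is the variational lower bound on the log marginal likelihood of a candidate model structure m. In generic VB notation (not necessarily the paper's exact notation), with latent assignments Z and parameters theta,

\mathcal{F}_m(q) \;=\; \mathbb{E}_{q(Z,\theta)}\!\left[\log \frac{p(X, Z, \theta \mid m)}{q(Z, \theta)}\right] \;\le\; \log p(X \mid m),

so maximizing F_m over both q and m jointly addresses parameter estimation and model-structure determination. The sketch below is not the authors' split-and-merge algorithm; it is a minimal illustration, assuming scikit-learn's BayesianGaussianMixture and synthetic data, of how the fitted variational bound can be compared across candidate numbers of mixture components to choose a model structure.

```python
# Minimal sketch (not the paper's split-and-merge algorithm): choose the number
# of mixture components by comparing the variational lower bound reached by a
# VB-fitted Gaussian mixture for each candidate structure.
# Assumes NumPy and scikit-learn are available; the data here are synthetic.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Synthetic data drawn from a 3-component mixture in 2-D.
X = np.vstack([
    rng.normal(loc=[0.0, 0.0], scale=0.5, size=(200, 2)),
    rng.normal(loc=[3.0, 3.0], scale=0.5, size=(200, 2)),
    rng.normal(loc=[0.0, 4.0], scale=0.5, size=(200, 2)),
])

best_k, best_bound = None, -np.inf
for k in range(1, 7):  # candidate model structures (number of components)
    vb_gmm = BayesianGaussianMixture(
        n_components=k,
        weight_concentration_prior_type="dirichlet_distribution",
        max_iter=500,
        random_state=0,
    ).fit(X)
    # lower_bound_ is the variational bound at convergence; it is used here
    # as the model-selection score for this candidate structure.
    print(f"k={k}: variational lower bound = {vb_gmm.lower_bound_:.4f}")
    if vb_gmm.lower_bound_ > best_bound:
        best_k, best_bound = k, vb_gmm.lower_bound_

print(f"selected number of components: {best_k}")
```

This exhaustive sweep over k is only for illustration; the paper's contribution is to explore model structures with split and merge moves guided by the same kind of variational bound, rather than refitting every candidate from scratch.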
Pages: 1223-1241
Number of pages: 19
Related papers
50 records in total
  • [1] Bayesian Linear Regression for Hidden Markov Model Based on Optimizing Variational Bounds
    Watanabe, Shinji
    Nakamura, Atsushi
    Juang, Biing-Hwang
    2011 IEEE INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP), 2011,
  • [2] Variational Bayesian Mixture of Robust CCA Models
    Viinikanoja, Jaakko
    Klami, Arto
    Kaski, Samuel
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, PT III, 2010, 6323 : 370 - 385
  • [3] Asymmetric Mixture Model with Variational Bayesian Learning
    Thanh Minh Nguyen
    Wu, Q. M. Jonathan
    PROCEEDINGS OF THE 2014 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2014, : 285 - 290
  • [4] Parameter Estimation of Gaussian Mixture Model Based on Variational Bayesian Learning
    Zhao, Linchang
    Shang, Zhaowei
    Qin, Anyong
    Tang, Yuan Yan
    PROCEEDINGS OF 2018 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS (ICMLC), VOL 1, 2018, : 99 - 104
  • [5] GPGPU Implementation of Variational Bayesian Gaussian Mixture Models
    Nishimoto, Hiroki
    Zhang, Renyuan
    Nakashima, Yasuhiko
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2022, E105D (03) : 611 - 622
  • [6] Streaming Variational Inference for Bayesian Nonparametric Mixture Models
    Tank, Alex
    Foti, Nicholas J.
    Fox, Emily B.
    ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 38, 2015, 38 : 968 - 976
  • [7] Bayesian Estimation of Beta Mixture Models with Variational Inference
    Ma, Zhanyu
    Leijon, Arne
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2011, 33 (11) : 2160 - 2173
  • [8] Model Selection of Bayesian Hierarchical Mixture of Experts based on Variational Inference
    Iikubo, Yuji
    Horii, Shunsuke
    Matsushima, Toshiyasu
    2019 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS (SMC), 2019, : 3474 - 3479
  • [9] GPGPU Implementation of Variational Bayesian Gaussian Mixture Models
    Nishimoto, Hiroki
    Nakada, Takashi
    Nakashima, Yasuhiko
    2019 SEVENTH INTERNATIONAL SYMPOSIUM ON COMPUTING AND NETWORKING (CANDAR 2019), 2019, : 185 - 190
  • [10] Variational Bayesian Feature Selection for Gaussian Mixture Models
    Valente, F
    Wellekens, C
    2004 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, VOL I, PROCEEDINGS: SPEECH PROCESSING, 2004, : 513 - 516