Variational approximations in Bayesian model selection for finite mixture distributions

Cited by: 93
Authors
McGrory, C. A. [1 ]
Titterington, D. M. [1 ]
Affiliations
[1] Univ Glasgow, Glasgow G12 8QQ, Lanark, Scotland
Funding
Engineering and Physical Sciences Research Council (EPSRC), UK;
Keywords
Bayesian analysis; deviance information criterion (DIC); mixtures; variational approximations;
DOI
10.1016/j.csda.2006.07.020
Chinese Library Classification
TP39 [Computer applications];
Subject classification codes
081203; 0835;
Abstract
Variational methods, which have become popular in the neural computing/machine learning literature, are applied to the Bayesian analysis of mixtures of Gaussian distributions. It is also shown how the deviance information criterion (DIC) can be extended to this type of model by exploiting variational approximations. The use of variational methods for model selection and the calculation of a DIC are illustrated with real and simulated data. The variational approach allows simultaneous estimation of the component parameters and the model complexity. It is found that, when a large number of components is initially specified, the superfluous components are eliminated as the method converges to a solution; this corresponds to an automatic choice of model complexity, and the appropriateness of that choice is reflected in the DIC values. (C) 2006 Elsevier B.V. All rights reserved.
Pages: 5352-5367
Number of pages: 16
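The abstract above describes the key practical behaviour of the variational approach: start the fit with deliberately many mixture components and let the variational updates drive the weights of superfluous components towards zero, which amounts to an automatic choice of model complexity. The sketch below is not the authors' code; it illustrates the same qualitative behaviour using scikit-learn's BayesianGaussianMixture, a variational approximation for Gaussian mixtures. The simulated data, the prior concentration value, and the 0.01 weight threshold are illustrative assumptions, not values from the paper.

import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)

# Simulated data: three well-separated Gaussian components in one dimension.
X = np.concatenate([
    rng.normal(-4.0, 1.0, 300),
    rng.normal(0.0, 0.5, 300),
    rng.normal(5.0, 1.5, 300),
]).reshape(-1, 1)

# Start with far more components than needed (10); the variational updates
# shrink the mixing weights of unnecessary components towards zero.
vb_gmm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_distribution",
    weight_concentration_prior=1e-3,  # small prior concentration encourages pruning
    covariance_type="full",
    max_iter=500,
    random_state=0,
).fit(X)

# Components with non-negligible posterior weight give the effective model size.
effective = np.sum(vb_gmm.weights_ > 1e-2)
print("posterior component weights:", np.round(vb_gmm.weights_, 3))
print("effective number of components:", effective)  # typically 3 for this data

With a small weight-concentration prior, only a few components retain appreciable posterior weight, so the number of surviving components acts as an automatic estimate of model complexity; the abstract notes that the appropriateness of this choice is reflected in the (variationally approximated) DIC values.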