On initialization of Gaussian mixtures: A hybrid genetic EM algorithm

Cited: 0
Authors
Pernkopf, F [1 ]
Affiliation
[1] Graz Univ Technol, Lab Signal Proc & Speech Commun, A-8010 Graz, Austria
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
We propose a genetic-based expectation-maximization (GA-EM) algorithm for learning Gaussian mixture models from multivariate data. The algorithm is able to select the number of mixture components using the minimum description length (MDL) criterion. We combine EM and GA into a single procedure: the population-based stochastic search of the GA explores the search space more thoroughly than the EM method alone, so the algorithm is less sensitive to its initialization and can escape from locally optimal solutions. The GA-EM algorithm is elitist, which preserves the monotonic convergence property of the EM algorithm. Experiments show that GA-EM outperforms the EM method in two respects: (i) it obtains a better MDL score under exactly the same initialization and termination conditions, and (ii) it identifies the number of components used to generate the underlying data more often than the EM algorithm does.
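A minimal Python sketch of the kind of procedure the abstract describes may help make the interplay of GA and EM concrete. This is an illustration only, assuming full-covariance Gaussians, a simple MDL penalty, and fresh random individuals in place of the paper's crossover and mutation operators; the function and parameter names (em_step, mdl, ga_em, k_max, pop_size, n_generations) are ours, not the authors'.

import numpy as np

def em_step(X, weights, means, covs, n_iter=5):
    # A few EM iterations for a full-covariance Gaussian mixture.
    n, d = X.shape
    k = len(weights)
    ll = -np.inf
    for _ in range(n_iter):
        # E-step: unnormalised responsibilities, then the data log-likelihood.
        resp = np.empty((n, k))
        for j in range(k):
            cov = covs[j] + 1e-6 * np.eye(d)                  # regularise
            diff = X - means[j]
            expo = -0.5 * np.sum(diff @ np.linalg.inv(cov) * diff, axis=1)
            norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(cov))
            resp[:, j] = weights[j] * np.exp(expo) / norm
        ll = np.sum(np.log(resp.sum(axis=1) + 1e-300))
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means and covariances.
        nk = resp.sum(axis=0) + 1e-12
        weights = nk / n
        means = (resp.T @ X) / nk[:, None]
        covs = [(resp[:, j, None] * (X - means[j])).T @ (X - means[j]) / nk[j]
                for j in range(k)]
    return weights, means, covs, ll

def mdl(ll, k, n, d):
    # Minimum description length: negative log-likelihood plus a penalty of
    # half the number of free parameters times log(sample size).
    n_params = k * (1 + d + d * (d + 1) / 2) - 1
    return -ll + 0.5 * n_params * np.log(n)

def ga_em(X, k_max=6, pop_size=8, n_generations=20, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape

    def random_individual():
        # An individual is a mixture with a random number of components,
        # means drawn from the data and broad initial covariances.
        k = int(rng.integers(1, k_max + 1))
        means = X[rng.choice(n, size=k, replace=False)]
        covs = [np.cov(X.T) + 1e-6 * np.eye(d) for _ in range(k)]
        return np.full(k, 1.0 / k), means, covs

    population = [random_individual() for _ in range(pop_size)]
    best, best_score = None, np.inf
    for _ in range(n_generations):
        scored = []
        for w, m, c in population:
            w, m, c, ll = em_step(X, w, m, c)             # local refinement by EM
            scored.append((mdl(ll, len(w), n, d), (w, m, c)))
        scored.sort(key=lambda s: s[0])
        if scored[0][0] < best_score:                     # elitism: remember the best ever seen
            best_score, best = scored[0]
        # Next generation: the elite survivor plus fresh random individuals
        # (a crude stand-in for the paper's crossover and mutation operators).
        population = [best] + [random_individual() for _ in range(pop_size - 1)]
    return best, best_score

In this sketch, the best individual found so far is always carried over unchanged and EM refinement never decreases its likelihood, so the best MDL score can only improve from one generation to the next, mirroring the monotonic convergence property mentioned in the abstract.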
Pages: 693 - 696
Number of pages: 4
Related Papers
50 items in total
  • [41] Diffusion-Based EM Algorithm for Distributed Estimation of Gaussian Mixtures in Wireless Sensor Networks
    Weng, Yang
    Xiao, Wendong
    Xie, Lihua
    SENSORS, 2011, 11 (06) : 6297 - 6316
  • [42] Multiscale modeling for classification of SAR imagery using hybrid EM algorithm and genetic algorithm
    Wen, Xianbin
    Zhang, Hua
    Zhang, Jianguang
    Jiao, Xu
    Wang, Lei
    PROGRESS IN NATURAL SCIENCE, 2009, 19 (08) : 1033 - 1036
  • [45] Improved Initialization of the EM Algorithm for Mixture Model Parameter Estimation
    Panic, Branislav
    Klemenc, Jernej
    Nagode, Marko
    MATHEMATICS, 2020, 8 (03)
  • [46] EM algorithm with PIP initialization and temperature-based selection
    Ishikawa, Yuta
    Nakano, Ryohei
    KNOWLEDGE-BASED INTELLIGENT INFORMATION AND ENGINEERING SYSTEMS, PT 3, PROCEEDINGS, 2008, 5179 : 58 - 66
  • [47] HIERARCHICAL MIXTURES OF EXPERTS AND THE EM ALGORITHM
    JORDAN, MI
    JACOBS, RA
    NEURAL COMPUTATION, 1994, 6 (02) : 181 - 214
  • [48] On convergence and parameter selection of the EM and DA-EM algorithms for Gaussian mixtures
    Yu, Jian
    Chaomurilige, Chaomu
    Yang, Miin-Shen
    PATTERN RECOGNITION, 2018, 77 : 188 - 203
  • [49] Research on correct convergence of the EM algorithm for Gaussian mixtures
    Fu, SQ
    Cao, BY
    Ma, JW
    ICONIP'02: PROCEEDINGS OF THE 9TH INTERNATIONAL CONFERENCE ON NEURAL INFORMATION PROCESSING: COMPUTATIONAL INTELLIGENCE FOR THE E-AGE, 2002, : 2660 - 2664
  • [50] A Greedy EM Algorithm for Gaussian Mixture Learning
    Nikos Vlassis
    Aristidis Likas
    Neural Processing Letters, 2002, 15 : 77 - 87