Genetic-based EM algorithm for learning Gaussian mixture models

Cited by: 189
Authors
Pernkopf, F
Bouchaffra, D
Affiliations
[1] Univ Washington, Dept Elect Engn, Seattle, WA 98195 USA
[2] Graz Univ Technol, Lab Signal Proc & Speech Commun, A-8010 Graz, Austria
[3] Oakland Univ, Dept Comp Sci & Engn, Rochester, MI 48309 USA
Keywords
unsupervised learning; clustering; Gaussian mixture models; EM algorithm; genetic algorithm; minimum description length
DOI
10.1109/TPAMI.2005.162
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
We propose a genetic-based expectation-maximization (GA-EM) algorithm for learning Gaussian mixture models from multivariate data. The algorithm selects the number of mixture components using the minimum description length (MDL) criterion. Our approach combines the genetic algorithm (GA) and the EM algorithm into a single procedure, benefiting from the properties of both: the population-based stochastic search of the GA explores the search space more thoroughly than EM alone, so the algorithm is less sensitive to initialization and can escape locally optimal solutions. GA-EM is elitist, which preserves the monotonic convergence property of the EM algorithm. Experiments on simulated and real data show that GA-EM outperforms EM in two respects: 1) it obtains a better MDL score under exactly the same termination condition, and 2) it identifies the number of components used to generate the underlying data more often than the EM algorithm.
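The abstract describes the method at a high level. The sketch below (Python with NumPy and scikit-learn, not the authors' code) is a minimal illustration of the core loop under simplifying assumptions: a population of candidate GMMs with varying component counts, each refined by a short EM run, scored by MDL, with the best half surviving each generation (elitism). The names `mdl`, `ga_em`, `short_em`, and all hyperparameters are illustrative; the paper's crossover and component-level mutation operators are reduced here to a simple mutation of the component count.

```python
# Illustrative GA-EM sketch (not the authors' implementation): an elitist,
# MDL-driven search over GMMs, each candidate refined by a short EM run.
import numpy as np
from sklearn.mixture import GaussianMixture


def mdl(gmm, X):
    """MDL = -log L + (p/2) log N for a full-covariance GMM; lower is better."""
    n, d = X.shape
    k = gmm.n_components
    p = (k - 1) + k * d + k * d * (d + 1) // 2  # weights + means + covariances
    return -gmm.score(X) * n + 0.5 * p * np.log(n)  # score() = mean log-lik.


def ga_em(X, pop_size=8, max_k=10, generations=25, em_steps=5, seed=0):
    rng = np.random.default_rng(seed)

    def short_em(k):
        # a few EM iterations from a random start stand in for one EM phase
        return GaussianMixture(
            n_components=k, max_iter=em_steps, init_params="random",
            random_state=int(rng.integers(2**31)),
        ).fit(X)

    population = [short_em(int(rng.integers(1, max_k + 1)))
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=lambda g: mdl(g, X))
        elite = population[: pop_size // 2]      # elitism: best half survives
        offspring = []
        for parent in elite:                     # mutate the component count
            k = parent.n_components + int(rng.choice([-1, 1]))
            offspring.append(short_em(min(max(k, 1), max_k)))
        population = elite + offspring
    return min(population, key=lambda g: mdl(g, X))


# Usage: three well-separated 2-D Gaussians; the search should settle on k = 3.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(m, 1.0, size=(200, 2)) for m in (-6.0, 0.0, 6.0)])
best = ga_em(X)
print(best.n_components, round(mdl(best, X), 1))
```

On well-separated synthetic data such as this, the MDL penalty steers the population toward the generating component count; none of the hyperparameter values above are taken from the paper.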
Pages: 1344-1348 (5 pages)