Maximum likelihood estimation of Gaussian mixture models without matrix operations

Cited: 0
Authors
Hien D. Nguyen
Geoffrey J. McLachlan
Affiliations
[1] Department of Mathematics, School of Mathematics and Physics, University of Queensland
Keywords
Gaussian mixture model; Minorization–maximization algorithm; Matrix operation-free; Linear regression; 65C60; 62E10
DOI: not available
Abstract
The Gaussian mixture model (GMM) is a popular tool for multivariate analysis, in particular, cluster analysis. The expectation–maximization (EM) algorithm is generally used to perform maximum likelihood (ML) estimation for GMMs, because its M-step exists in closed form and it enjoys desirable numerical properties, such as monotonicity of the likelihood. However, the EM algorithm has been criticized as slow to converge and thus computationally expensive in some situations. In this article, we introduce the linear regression characterization (LRC) of the GMM. We show that the parameters of an LRC of the GMM can be mapped back to the natural parameters, and that a minorization–maximization (MM) algorithm can be constructed that retains the desirable numerical properties of the EM algorithm without the use of matrix operations. We prove that the ML estimators of the LRC parameters are consistent and asymptotically normal, like their natural counterparts. Furthermore, we show that the LRC allows for simple handling of singularities in the ML estimation of GMMs. Using numerical simulations in the R programming environment, we then demonstrate that the MM algorithm can be faster than the EM algorithm in various large-data situations, where sample sizes range from the tens of thousands to the hundreds of thousands, for models with up to 16 mixture components on multivariate data with up to 16 variables.
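The abstract contrasts the proposed MM algorithm with the standard EM baseline for GMMs. The paper's LRC-based, matrix-operation-free updates are not reproduced here; the sketch below only illustrates the EM baseline the abstract refers to (closed-form M-step, monotone likelihood), restricted to the univariate case for brevity. All names are illustrative, not from the paper, and the initialization strategy is an assumption.

```python
import numpy as np

def em_gmm_1d(x, k=2, n_iter=200):
    """Standard EM for a univariate k-component Gaussian mixture.

    Illustrates the properties the abstract attributes to EM:
    the M-step updates are available in closed form, and each
    iteration cannot decrease the observed-data log-likelihood.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    w = np.full(k, 1.0 / k)                          # mixing weights
    mu = np.quantile(x, (np.arange(k) + 0.5) / k)    # spread initial means (assumed init)
    var = np.full(k, x.var())                        # broad initial variances
    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] = P(component j | x_i)
        dens = (w / np.sqrt(2.0 * np.pi * var)) \
            * np.exp(-(x[:, None] - mu) ** 2 / (2.0 * var))
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: closed-form weighted-moment updates
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-12
    return w, mu, var
```

In the multivariate setting each M-step involves covariance-matrix inversions and determinants; avoiding such matrix operations via the LRC reparametrization is precisely the paper's contribution.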
Pages: 371–394 (23 pages)
Related papers
50 entries in total
  • [31] Maximum Likelihood Estimation for Matrix Normal Models via Quiver Representations
    Derksen, Harm
    Makam, Visu
    SIAM JOURNAL ON APPLIED ALGEBRA AND GEOMETRY, 2021, 5 (02) : 338 - 365
  • [32] On the Maximum Likelihood Estimation of a Covariance Matrix
    Tsai, Ming-Tien
    MATHEMATICAL METHODS OF STATISTICS, 2018, 27 (01) : 71 - 82
  • [33] Refined Convergence Rates for Maximum Likelihood Estimation under Finite Mixture Models
    Manole, Tudor
    Ho, Nhat
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022,
  • [34] Maximum Likelihood Estimation of Semiparametric Mixture Component Models for Competing Risks Data
    Choi, Sangbum
    Huang, Xuelin
    BIOMETRICS, 2014, 70 (03) : 588 - 598
  • [35] Maximum Likelihood Estimation of Parameters in a Mixture Model
    Bhat, Satish
    Vidya, R.
    Parameshwar, V. Pandit
    COMMUNICATIONS IN STATISTICS-SIMULATION AND COMPUTATION, 2016, 45 (05) : 1776 - 1784
  • [36] Averaging, maximum penalized likelihood and Bayesian estimation for improving Gaussian mixture probability density estimates
    Ormoneit, D
    Tresp, V
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 1998, 9 (04) : 639 - 650
  • [37] Improved methods for parameter estimation of mixture Gaussian model using genetic and maximum likelihood algorithms
    Nasab, NM
    Analoui, M
    Delp, EJ
    MEDICAL IMAGING 2004: IMAGE PROCESSING, PTS 1-3, 2004, 5370 : 566 - 576
  • [38] ESTIMATION OF THE VARIANCE FOR THE MAXIMUM LIKELIHOOD ESTIMATES IN NORMAL MIXTURE MODELS AND NORMAL HIDDEN MARKOV MODELS
    Iqbal, Muhammad
    Nishi, Akihiro
    Kikuchi, Yasuki
    Nomakuchi, Kentaro
    JOURNAL JAPANESE SOCIETY OF COMPUTATIONAL STATISTICS, 2011, 24 (01) : 39 - 66
  • [39] Maximum likelihood robust regression by mixture models
    Brandt, Sami S.
    JOURNAL OF MATHEMATICAL IMAGING AND VISION, 2006, 25 (01) : 25 - 48