A NON ASYMPTOTIC PENALIZED CRITERION FOR GAUSSIAN MIXTURE MODEL SELECTION

Cited by: 30
Authors
Maugis, Cathy [1 ]
Michel, Bertrand [2 ]
Affiliations
[1] Univ Toulouse, Inst Math Toulouse, INSA Toulouse, F-31077 Toulouse 4, France
[2] Univ Paris 06, Lab Stat Theor & Appl, F-75013 Paris, France
Keywords
Model-based clustering; variable selection; penalized likelihood criterion; bracketing entropy; maximum likelihood; convergence; rates
DOI
10.1051/ps/2009004
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline Classification Codes
020208; 070103; 0714
Abstract
Specific Gaussian mixtures are considered to solve simultaneously the variable selection and clustering problems. A non-asymptotic penalized criterion is proposed to choose the number of mixture components and the relevant variable subset. Because of the nonlinearity of the associated Kullback-Leibler contrast on Gaussian mixtures, a general model selection theorem for maximum likelihood estimation proposed by Massart [Concentration Inequalities and Model Selection, Springer, Berlin (2007); lectures from the 33rd Summer School on Probability Theory held in Saint-Flour, July 6-23, 2003] is used to obtain the form of the penalty function. This theorem requires controlling the bracketing entropy of the Gaussian mixture families. Both the ordered and non-ordered variable selection cases are addressed in this paper.
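For intuition, the selection rule summarized in the abstract can be sketched as follows. The notation below (the estimators \(\hat s_{(K,v)}\), dimensions \(D_{(K,v)}\), and constants \(c_1, c_2\)) is illustrative only; the exact penalty lower bound and constants are those derived in the paper from the bracketing-entropy bounds, for both the ordered and non-ordered cases:
\[
(\widehat K, \widehat v) \in \operatorname*{arg\,min}_{(K,v)} \Big\{ -\frac{1}{n} \sum_{i=1}^{n} \log \hat s_{(K,v)}(x_i) + \operatorname{pen}(K,v) \Big\},
\qquad
\operatorname{pen}(K,v) \;\propto\; \frac{D_{(K,v)}}{n} \Big( c_1 + c_2 \log \frac{n}{D_{(K,v)}} \Big),
\]
where \(\hat s_{(K,v)}\) denotes the maximum likelihood estimator over the Gaussian mixture family with \(K\) components and relevant-variable subset \(v\), and \(D_{(K,v)}\) is the number of free parameters of that family.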
Pages: 41-68
Number of pages: 28
Related Papers (50 records)
  • [41] Unsupervised Segmentation of Spectral Images with a Spatialized Gaussian Mixture Model and Model Selection
    Cohen, S. X.
    Le Pennec, E.
    OIL AND GAS SCIENCE AND TECHNOLOGY-REVUE D IFP ENERGIES NOUVELLES, 2014, 69 (02): 245 - 259
  • [42] Collaborative penalized Gaussian mixture PHD tracker for close target tracking
    Wang, Yan
    Meng, Huadong
    Liu, Yimin
    Wang, Xiqin
    SIGNAL PROCESSING, 2014, 102 : 1 - 15
  • [43] An iterative algorithm for BYY learning on Gaussian mixture with automated model selection
    Ma, JW
    Wang, TJ
    Xu, L
    PROCEEDINGS OF 2003 INTERNATIONAL CONFERENCE ON NEURAL NETWORKS & SIGNAL PROCESSING, PROCEEDINGS, VOLS 1 AND 2, 2003: 7 - 10
  • [44] A generalized competitive learning algorithm on Gaussian mixture with automatic model selection
    Lu, Zhiwu
    Lu, Xiaoqing
    ROUGH SETS AND KNOWLEDGE TECHNOLOGY, PROCEEDINGS, 2006, 4062 : 560 - 567
  • [45] Gaussian Mixture Model Selection Using Multiple Random Subsampling with Initialization
    Psutka, Josef V.
    COMPUTER ANALYSIS OF IMAGES AND PATTERNS, CAIP 2015, PT I, 2015, 9256 : 678 - 689
  • [46] The BYY annealing learning algorithm for Gaussian mixture with automated model selection
    Ma, Jinwen
    Liu, Jianfeng
    PATTERN RECOGNITION, 2007, 40 (07) : 2029 - 2037
  • [47] Voice Activity Detection Based on Sequential Gaussian Mixture Model with Maximum Likelihood Criterion
    Shen, Zhan
    Wei, Jianguo
    Lu, Wenhuan
    Dang, Jianwu
    2016 10TH INTERNATIONAL SYMPOSIUM ON CHINESE SPOKEN LANGUAGE PROCESSING (ISCSLP), 2016
  • [48] A non-asymptotic approach for model selection via penalization in high-dimensional mixture of experts models
    Nguyen, TrungTin
    Nguyen, Hien Duy
    Chamroukhi, Faicel
    Forbes, Florence
    ELECTRONIC JOURNAL OF STATISTICS, 2022, 16 (02): 4742 - 4822
  • [49] Learning from Inconsistent and Unreliable Annotators by a Gaussian Mixture Model and Bayesian Information Criterion
    Zhang, Ping
    Obradovic, Zoran
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, PT III, 2011, 6913 : 553 - 568
  • [50] Cluster number selection using finite mixture model and penalized Fisher class separability measure
    Wang, Xudong
    Syrmos, Vassilis L.
    2007 AMERICAN CONTROL CONFERENCE, VOLS 1-13, 2007: 4160 - +