Hierarchical Bayes based Adaptive Sparsity in Gaussian Mixture Model

Times Cited: 0
Authors
Wang, Binghui [1 ]
Lin, Chuang [1 ,2 ]
Fan, Xin [1 ]
Jiang, Ning [2 ]
Farina, Dario [2 ]
Affiliations
[1] Dalian Univ Technol, Sch Software, Dalian, Peoples R China
[2] Univ Gottingen, Univ Med Ctr Goettingen, Dept Neurorehabil Engn, D-37073 Gottingen, Germany
Funding
European Research Council
Keywords
High-dimensional parameter estimation; Hierarchical Bayes; Adaptive sparsity; GMM; COVARIANCE-MATRIX ESTIMATION; CONVERGENCE; SELECTION; RATES;
DOI
10.1016/j.patrec.2014.07.008
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The Gaussian Mixture Model (GMM) has been widely used in statistics for its great flexibility. However, parameter estimation for a high-dimensional GMM is challenging because of the large number of parameters and the scarcity of observation data. In this paper, we propose an effective method, named hierarchical Bayes based Adaptive Sparsity in Gaussian Mixture Model (ASGMM), to estimate the parameters of a GMM by incorporating a two-layer hierarchical Bayes based adaptive sparsity prior. The prior imposed on the precision matrices encourages sparsity and hence reduces the effective number of parameters to be estimated. In contrast to the l1-norm penalty or Laplace prior, our approach involves no hyperparameters that must be tuned, and the sparsity adapts to the observation data. The proposed method proceeds in three steps: first, we formulate an adaptive hierarchical Bayes model of the precision matrices in the GMM with a Jeffreys noninformative hyperprior, which is scale-invariant and, more importantly, hyperparameter-free and unbiased. Second, we perform a Cholesky decomposition of the precision matrices to enforce positive definiteness. Finally, we use the expectation-maximization (EM) algorithm to obtain the final parameter estimates of the GMM. Experimental results on synthetic and real-world datasets demonstrate that ASGMM not only adapts the sparsity of high-dimensional data with small estimation error, but also achieves better clustering performance compared with several classical methods. (C) 2014 Elsevier B.V. All rights reserved.
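The abstract describes a three-step pipeline: an adaptive hierarchical-Bayes (Jeffreys-type hyperprior) model on the precision matrices, a Cholesky factorization that keeps them positive definite, and an EM loop for the mixture parameters. The sketch below illustrates that pipeline in plain NumPy/SciPy; it is a minimal illustration under stated assumptions, not the authors' ASGMM derivation. In particular, the function name asgmm_em_sketch, the eps constant, and the adaptive soft-thresholding of the Cholesky factor (whose 1/|entry| weights merely mimic the hyperparameter-free, data-adaptive shrinkage a Jeffreys-type prior induces) are assumptions of this sketch.

import numpy as np
from scipy.stats import multivariate_normal

def asgmm_em_sketch(X, K, n_iter=50, seed=0, eps=1e-3):
    """EM for a GMM whose precision matrices are sparsified adaptively
    through their Cholesky factors (which keeps them positive definite).
    Illustrative surrogate for the ASGMM update, not the paper's exact method."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    pi = np.full(K, 1.0 / K)                       # mixing weights
    mu = X[rng.choice(n, K, replace=False)]        # means initialized from data
    cov = np.array([np.cov(X.T) + eps * np.eye(d) for _ in range(K)])

    for _ in range(n_iter):
        # E-step: responsibilities of each component for each sample
        log_r = np.column_stack([
            np.log(pi[k]) + multivariate_normal.logpdf(X, mu[k], cov[k])
            for k in range(K)
        ])
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)

        # M-step: update weights, means, and sparsified precision matrices
        Nk = r.sum(axis=0)
        pi = Nk / n
        mu = (r.T @ X) / Nk[:, None]
        for k in range(K):
            Xc = X - mu[k]
            S = (r[:, k, None] * Xc).T @ Xc / Nk[k] + eps * np.eye(d)
            # Adaptive-sparsity surrogate: soft-threshold small off-diagonal
            # entries of the Cholesky factor of the precision matrix, with
            # weights ~ 1/|entry| standing in for the Jeffreys-type reweighting.
            L = np.linalg.cholesky(np.linalg.inv(S))
            off = ~np.eye(d, dtype=bool)
            w = eps / (np.abs(L[off]) + eps)       # larger weight => more shrinkage
            L[off] = np.sign(L[off]) * np.maximum(np.abs(L[off]) - w * eps, 0.0)
            prec = L @ L.T                          # positive definite by construction
            cov[k] = np.linalg.inv(prec + eps * np.eye(d))
    return pi, mu, cov

Calling pi, mu, cov = asgmm_em_sketch(X, K=3) on an (n, d) data matrix returns mixing weights, component means, and covariances whose inverses have been sparsified; operating on the Cholesky factor rather than on the precision entries directly is what preserves positive definiteness, mirroring the role of the Cholesky step described in the abstract.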
Pages: 238-247
Number of pages: 10