Variance Matrix Priors for Dirichlet Process Mixture Models With Gaussian Kernels

Cited by: 1
Authors
Jing, Wei [1]
Papathomas, Michail [1]
Liverani, Silvia [2,3]
Affiliations
[1] Univ St Andrews, Sch Math & Stat, St Andrews, Scotland
[2] Queen Mary Univ London, Sch Math Sci, London, England
[3] Alan Turing Inst, British Lib, London, England
Keywords
Bayesian non-parametrics; clustering; Bayesian variable selection; prior distributions; profile regression; number; lasso
DOI
10.1111/insr.12595
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline codes
020208; 070103; 0714
Abstract
Bayesian mixture modelling is widely used for density estimation and clustering. The Dirichlet process mixture model (DPMM) is the most popular Bayesian non-parametric mixture modelling approach. In this manuscript, we study the choice of prior for the variance or precision matrix when Gaussian kernels are adopted. Typically, in the relevant literature, the assessment of mixture models is done by considering observations in a space of only a handful of dimensions. Instead, we are concerned with more realistic problems of higher dimensionality, in a space of up to 20 dimensions. We observe that the choice of prior is increasingly important as the dimensionality of the problem increases. After identifying certain undesirable properties of standard priors in problems of higher dimensionality, we review and implement possible alternative priors. The most promising priors are identified, as well as other factors that affect the convergence of MCMC samplers. Our results show that the choice of prior is critical for deriving reliable posterior inferences. This manuscript offers a thorough overview and comparative investigation into possible priors, with detailed guidelines for their implementation. Although our work focuses on the use of the DPMM in clustering, it is also applicable to density estimation.
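The abstract concerns the prior placed on the component variance (or precision) matrices in a Gaussian-kernel DPMM. As a purely illustrative sketch, not the authors' implementation (the paper studies MCMC samplers), the snippet below fits a truncated Dirichlet process mixture of Gaussians with scikit-learn's variational BayesianGaussianMixture, where degrees_of_freedom_prior and covariance_prior play the role of the Wishart-type hyperparameters on the component covariance structure; the 20-dimensional synthetic data and all hyperparameter values are assumptions chosen only for the example.

```python
# Minimal sketch: truncated DP mixture with Gaussian kernels in 20 dimensions.
# NOTE: scikit-learn uses variational inference, not the MCMC samplers studied
# in the paper, and the prior settings below are illustrative assumptions only.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
d = 20  # dimensionality of the order considered in the paper (up to 20)
# Three well-separated synthetic Gaussian clusters.
X = np.vstack([rng.normal(loc=m, scale=1.0, size=(200, d)) for m in (-3.0, 0.0, 3.0)])

dpmm = BayesianGaussianMixture(
    n_components=30,                               # truncation level of the stick-breaking DP
    weight_concentration_prior_type="dirichlet_process",
    weight_concentration_prior=1.0,                # DP concentration parameter
    covariance_type="full",
    degrees_of_freedom_prior=d + 2,                # Wishart degrees of freedom (must exceed d - 1)
    covariance_prior=np.eye(d),                    # scale matrix of the covariance prior
    max_iter=500,
    random_state=0,
)
dpmm.fit(X)
print("components actually used:", np.unique(dpmm.predict(X)).size)
```

In higher dimensions the resulting clustering is sensitive to degrees_of_freedom_prior and covariance_prior, which mirrors the paper's observation that the variance matrix prior becomes increasingly influential as dimensionality grows.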
Pages: 25
Related papers
50 records in total
  • [21] Distributed Inference for Dirichlet Process Mixture Models. Ge, Hong; Chen, Yutian; Wan, Moquan; Ghahramani, Zoubin. International Conference on Machine Learning (ICML), Vol. 37, 2015: 2276-2284.
  • [22] Dirichlet Process Mixture Models with Multiple Modalities. Paisley, John; Carin, Lawrence. 2009 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2009: 1613-1616.
  • [23] Background Subtraction with Dirichlet Process Mixture Models. Haines, Tom S. F.; Xiang, Tao. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2014, 36(4): 670-683.
  • [24] Collapsed Variational Dirichlet Process Mixture Models. Kurihara, Kenichi; Welling, Max; Teh, Yee Whye. 20th International Joint Conference on Artificial Intelligence (IJCAI), 2007: 2796-2801.
  • [26] Improving Bayesian radiological profiling of waste drums using Dirichlet priors, Gaussian process priors, and hierarchical modeling. Laloy, Eric; Rogiers, Bart; Bielen, An; Borella, Alessandro; Boden, Sven. Applied Radiation and Isotopes, 2023, 194.
  • [27] Study on hybrid sampling inference for Dirichlet process mixture of Gaussian process model. Lei, Ju-Yang; Huang, Ke; Xu, Hai-Xiang; Shi, Xi-Zhi. Journal of Shanghai Jiaotong University, 2010, 44(2): 271-275.
  • [28] Dirichlet Process Gaussian Mixture Models for Real-Time Monitoring and Their Application to Chemical Mechanical Planarization. Liu, Jia; Beyca, Omer F.; Rao, Prahalad K.; Kong, Zhenyu; Bukkapatnam, Satish T. S. IEEE Transactions on Automation Science and Engineering, 2017, 14(1): 208-221.
  • [29] (H)DPGMM: a hierarchy of Dirichlet process Gaussian mixture models for the inference of the black hole mass function. Rinaldi, Stefano; Del Pozzo, Walter. Monthly Notices of the Royal Astronomical Society, 2022, 509(4): 5454-5466.