Variance Matrix Priors for Dirichlet Process Mixture Models With Gaussian Kernels

Cited by: 1
Authors
Jing, Wei [1 ]
Papathomas, Michail [1 ]
Liverani, Silvia [2 ,3 ]
Affiliations
[1] Univ St Andrews, Sch Math & Stat, St Andrews, Scotland
[2] Queen Mary Univ London, Sch Math Sci, London, England
[3] Alan Turing Inst, British Lib, London, England
Keywords
Bayesian non-parametrics; clustering; Bayesian variable selection; prior distributions; profile regression; number; lasso
DOI
10.1111/insr.12595
CLC Classification Numbers
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
Bayesian mixture modelling is widely used for density estimation and clustering. The Dirichlet process mixture model (DPMM) is the most popular Bayesian non-parametric approach to mixture modelling. In this manuscript, we study the choice of prior for the variance or precision matrix when Gaussian kernels are adopted. In the relevant literature, mixture models are typically assessed on observations in a space of only a handful of dimensions. Instead, we are concerned with more realistic problems of higher dimensionality, in spaces of up to 20 dimensions. We observe that the choice of prior becomes increasingly important as the dimensionality of the problem grows. After identifying undesirable properties of standard priors in higher-dimensional problems, we review and implement possible alternative priors. We identify the most promising priors, as well as other factors that affect the convergence of MCMC samplers. Our results show that the choice of prior is critical for deriving reliable posterior inferences. This manuscript offers a thorough overview and comparative investigation of possible priors, with detailed guidelines for their implementation. Although our work focuses on the use of the DPMM in clustering, it also applies to density estimation.
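To make concrete where such a prior enters the model, the following is a minimal Python sketch, not the authors' implementation: the paper studies MCMC samplers, whereas scikit-learn's BayesianGaussianMixture fits a truncated Dirichlet process mixture of Gaussians by variational inference. The covariance_prior and degrees_of_freedom_prior arguments set the inverse-Wishart prior on each component's covariance matrix; the identity scale matrix, the d + 2 degrees of freedom, and the synthetic two-cluster data are assumptions made for this example.

import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
d = 20  # dimensionality comparable to the paper's higher-dimensional setting

# Synthetic data: two well-separated Gaussian clusters (assumed for the demo).
X = np.vstack([
    rng.normal(0.0, 1.0, size=(200, d)),
    rng.normal(4.0, 0.5, size=(200, d)),
])

dpmm = BayesianGaussianMixture(
    n_components=30,  # truncation level for the Dirichlet process
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    covariance_prior=np.eye(d),      # inverse-Wishart scale matrix (assumed)
    degrees_of_freedom_prior=d + 2,  # must exceed d - 1; d + 2 is an assumption
    max_iter=500,
    random_state=0,
)
labels = dpmm.fit_predict(X)
print("clusters recovered:", np.unique(labels).size)

With a Dirichlet process prior on the weights, components beyond those supported by the data receive negligible mass, so the truncation level only needs to be generous rather than exact.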
Pages: 25
Related Papers (50 in total)
  • [1] Spiked Dirichlet Process Priors for Gaussian Process Models
    Savitsky, Terrance
    Vannucci, Marina
    JOURNAL OF PROBABILITY AND STATISTICS, 2010, 2010
  • [2] Low Information Omnibus (LIO) Priors for Dirichlet Process Mixture Models
    Shi, Yushu
    Martens, Michael
    Banerjee, Anjishnu
    Laud, Purushottam
    BAYESIAN ANALYSIS, 2019, 14 (03): : 677 - 702
  • [3] Dirichlet Process Gaussian Mixture Models: Choice of the Base Distribution
    Görür, Dilan
    Rasmussen, Carl Edward
    JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, 2010, 25 (04) : 653 - 664
  • [4] Additive mixed models with Dirichlet process mixture and P-spline priors
    Heinzl, Felix
    Fahrmeir, Ludwig
    Kneib, Thomas
    ASTA-ADVANCES IN STATISTICAL ANALYSIS, 2012, 96 (01) : 47 - 68
  • [5] Matrix-Variate Dirichlet Process Priors with Applications
    Zhang, Zhihua
    Wang, Dakan
    Dai, Guang
    Jordan, Michael I.
    BAYESIAN ANALYSIS, 2014, 9 (02): : 259 - 285
  • [6] Information value in nonparametric Dirichlet-process Gaussian-process (DPGP) mixture models
    Wei, Hongchuan
    Lu, Wenjie
    Zhu, Pingping
    Ferrari, Silvia
    Liu, Miao
    Klein, Robert H.
    Omidshafiei, Shayegan
    How, Jonathan P.
    AUTOMATICA, 2016, 74 : 360 - 368
  • [7] Estimating mixture of Dirichlet process models
    MacEachern, Steven N.
    Müller, Peter
    JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 1998, 7 (02) : 223 - 238