Variance Matrix Priors for Dirichlet Process Mixture Models With Gaussian Kernels

Cited: 1
Authors
Jing, Wei [1 ]
Papathomas, Michail [1 ]
Liverani, Silvia [2 ,3 ]
Affiliations
[1] Univ St Andrews, Sch Math & Stat, St Andrews, Scotland
[2] Queen Mary Univ London, Sch Math Sci, London, England
[3] Alan Turing Inst, British Lib, London, England
Keywords
Bayesian non-parametrics; clustering; BAYESIAN VARIABLE SELECTION; PRIOR DISTRIBUTIONS; PROFILE REGRESSION; NUMBER; LASSO;
DOI
10.1111/insr.12595
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline classification codes
020208; 070103; 0714
Abstract
Bayesian mixture modelling is widely used for density estimation and clustering. The Dirichlet process mixture model (DPMM) is the most popular Bayesian non-parametric mixture modelling approach. In this manuscript, we study the choice of prior for the variance or precision matrix when Gaussian kernels are adopted. In the relevant literature, mixture models are typically assessed on observations in spaces of only a handful of dimensions. Instead, we are concerned with more realistic problems of higher dimensionality, in spaces of up to 20 dimensions. We observe that the choice of prior becomes increasingly important as the dimensionality of the problem grows. After identifying certain undesirable properties of standard priors in problems of higher dimensionality, we review and implement possible alternative priors. We identify the most promising priors, as well as other factors that affect the convergence of MCMC samplers. Our results show that the choice of prior is critical for deriving reliable posterior inferences. This manuscript offers a thorough overview and comparative investigation of possible priors, with detailed guidelines for their implementation. Although our work focuses on the use of the DPMM in clustering, it is also applicable to density estimation.
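The abstract concerns priors on the component variance (or precision) matrices of a Gaussian-kernel DPMM. As a minimal illustrative sketch (not the authors' implementation), the generative model can be written under a truncated stick-breaking representation with an inverse-Wishart covariance prior; the settings `d`, `K`, `alpha`, `nu`, and `Psi` below are hypothetical choices, with `nu = d + 2` a common weakly informative default:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical settings: dimension d, truncation level K, DP concentration alpha.
d, K, alpha = 5, 20, 1.0

# Stick-breaking weights of a truncated Dirichlet process:
# w_k = v_k * prod_{j<k} (1 - v_j), with v_k ~ Beta(1, alpha).
v = rng.beta(1.0, alpha, size=K)
w = v * np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
w /= w.sum()  # renormalise because the representation is truncated at K

# Component covariances drawn from an inverse-Wishart(nu, Psi) prior,
# sampled through its Wishart dual: if W ~ Wishart(nu, Psi^{-1}),
# then W^{-1} ~ IW(nu, Psi). Requires integer nu >= d here.
nu, Psi = d + 2, np.eye(d)
Psi_inv = np.linalg.inv(Psi)

def sample_inv_wishart(nu, Psi_inv, rng):
    # Columns of L @ Z are N(0, Psi_inv); their outer-product sum is Wishart.
    L = np.linalg.cholesky(Psi_inv)
    X = L @ rng.standard_normal((Psi_inv.shape[0], int(nu)))
    return np.linalg.inv(X @ X.T)

mus = rng.standard_normal((K, d))  # simple N(0, I) prior on component means
Sigmas = np.stack([sample_inv_wishart(nu, Psi_inv, rng) for _ in range(K)])

# Generate n observations from the truncated DPMM.
n = 200
z = rng.choice(K, size=n, p=w)
y = np.array([rng.multivariate_normal(mus[k], Sigmas[k]) for k in z])
print(y.shape)  # (200, 5)
```

The same representation is what a blocked Gibbs sampler targets in posterior inference; the paper's point is that in higher dimensions the choice of `nu` and `Psi` (or an alternative covariance prior altogether) strongly affects the resulting clustering.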
Pages: 25