Bayesian mixture models (in)consistency for the number of clusters

Cited by: 1
Authors
Alamichel, Louise [1 ]
Bystrova, Daria [1 ,2 ]
Arbel, Julyan [1 ]
King, Guillaume Kon Kam [3 ]
Affiliations
[1] Univ Grenoble Alpes, Inria, Grenoble INP, LJK, CNRS, Grenoble, France
[2] Univ Savoie Mont Blanc, CNRS, Lab Ecol Alpine, Univ Grenoble Alpes, Grenoble, France
[3] Univ Paris Saclay, INRAE, MaIAGE, Jouy En Josas, France
Keywords
clustering; finite mixtures; finite-dimensional BNP representations; Gibbs-type process; Gibbs-type priors; Pitman-Yor; nonparametric inference; Dirichlet mixtures; density estimation; convergence rates; consistency
DOI
10.1111/sjos.12739
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208 ; 070103 ; 0714 ;
Abstract
Bayesian nonparametric mixture models are common for modeling complex data. While these models are well suited for density estimation, recent results proved posterior inconsistency for the number of clusters when the true number of components is finite, for Dirichlet process and Pitman-Yor process mixture models. We extend these results to additional Bayesian nonparametric priors, such as Gibbs-type processes and their finite-dimensional representations. The latter include the Dirichlet multinomial process and the recently proposed Pitman-Yor and normalized generalized gamma multinomial processes. We show that mixture models based on these processes are also inconsistent for the number of clusters and discuss possible solutions. Notably, we show that a post-processing algorithm introduced for the Dirichlet process extends to more general models and provides a consistent method to estimate the number of components.
Pages: 1619-1660 (42 pages)