Hierarchical clustering with discrete latent variable models and the integrated classification likelihood

Cited: 0
Authors
Etienne Côme
Nicolas Jouvin
Pierre Latouche
Charles Bouveyron
Affiliations
[1] COSYS/GRETTIA
[2] Université Gustave-Eiffel
[3] Université Paris 1 Panthéon-Sorbonne
[4] FP2M
[5] CNRS FR 2036
[6] Université de Paris
[7] MAP5
[8] CNRS
[9] Université Côte d’Azur
[10] CNRS
[11] Laboratoire J.A. Dieudonné
[12] Inria
[13] Maasai Research Team
Keywords
Mixture models; Block modeling; Co-clustering; Genetic algorithm; Model-based
DOI: not available
Abstract
Finding a set of nested partitions of a dataset is useful for uncovering relevant structure at different scales, but it is often handled with ad hoc, data-dependent methodology. In this paper, we introduce a general two-step methodology for model-based hierarchical clustering. Taking the integrated classification likelihood criterion as the objective function, the approach applies to any discrete latent variable model (DLVM) for which this quantity is tractable. The first step of the methodology maximizes the criterion with respect to the partition. To address the well-known problem of sub-optimal local maxima reached by greedy hill-climbing heuristics, we introduce a new hybrid algorithm, based on a genetic algorithm, that efficiently explores the space of solutions. The resulting algorithm carefully combines and merges different solutions, and jointly infers the number K of clusters and the clusters themselves. Starting from this natural partition, the second step of the methodology applies a bottom-up greedy procedure to extract a hierarchy of clusters. In a Bayesian context, this is achieved by treating the Dirichlet prior parameter α on the cluster proportions as a regularization term controlling the granularity of the clustering. A new approximation of the criterion is derived as a log-linear function of α, yielding a simple functional form for the merge decision criterion.
This second step allows the clustering to be explored at coarser scales. The proposed approach is compared with existing strategies in both simulated and real settings, and its results are shown to be particularly relevant. A reference implementation of this work is available in the R package greed accompanying the paper.
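The bottom-up second step can be sketched on a simple DLVM where the ICL is exact in the hard-assignment sense: a mixture with categorical emissions and conjugate symmetric Dirichlet priors (α on proportions, β on emissions). The sketch below is a minimal illustration of the greedy merge idea only, not the paper's hybrid algorithm or the greed package API; all function names and the `(size, aggregated_counts)` cluster representation are assumptions made for this example.

```python
from itertools import combinations
from math import lgamma

def log_dm(counts, beta):
    """Log marginal likelihood of aggregated category counts under a
    symmetric Dirichlet(beta) prior (Dirichlet-multinomial)."""
    n, d = sum(counts), len(counts)
    return (lgamma(d * beta) - lgamma(n + d * beta)
            + sum(lgamma(c + beta) - lgamma(beta) for c in counts))

def icl(clusters, alpha, beta):
    """Exact ICL of a hard partition; clusters is a list of
    (size, aggregated_counts) pairs. The Dirichlet(alpha) prior on the
    cluster proportions is integrated out, so alpha acts as the
    granularity-controlling regularization term."""
    n, k = sum(s for s, _ in clusters), len(clusters)
    prior = (lgamma(k * alpha) - lgamma(n + k * alpha)
             + sum(lgamma(s + alpha) - lgamma(alpha) for s, _ in clusters))
    return prior + sum(log_dm(c, beta) for _, c in clusters)

def merge(a, b):
    # Merging two clusters just adds their sufficient statistics.
    return (a[0] + b[0], [x + y for x, y in zip(a[1], b[1])])

def greedy_hierarchy(clusters, alpha=1.0, beta=1.0):
    """Bottom-up pass: at each step merge the pair of clusters whose
    fusion yields the highest ICL, down to a single cluster. Returns
    the merge order (the hierarchy) and the ICL value at each level."""
    clusters = list(clusters)
    path, merges = [icl(clusters, alpha, beta)], []
    while len(clusters) > 1:
        i, j = max(
            combinations(range(len(clusters)), 2),
            key=lambda ij: icl(
                [c for t, c in enumerate(clusters) if t not in ij]
                + [merge(clusters[ij[0]], clusters[ij[1]])], alpha, beta))
        merged = merge(clusters[i], clusters[j])
        clusters = [c for t, c in enumerate(clusters) if t not in (i, j)]
        clusters.append(merged)
        merges.append((i, j))
        path.append(icl(clusters, alpha, beta))
    return merges, path
```

On toy data with two near-identical emission profiles and one distinct one, the first recorded merge fuses the two similar clusters, and the ICL path shows whether each fusion gains or loses under the chosen α.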
Pages: 957 - 986 (29 pages)
Related papers (50 records)
  • [1] Hierarchical clustering with discrete latent variable models and the integrated classification likelihood
    Come, Etienne
    Jouvin, Nicolas
    Latouche, Pierre
    Bouveyron, Charles
    ADVANCES IN DATA ANALYSIS AND CLASSIFICATION, 2021, 15 (04) : 957 - 986
  • [2] Model selection for Gaussian latent block clustering with the integrated classification likelihood
    Lomet, Aurore
    Govaert, Gérard
    Grandvalet, Yves
    ADVANCES IN DATA ANALYSIS AND CLASSIFICATION, 2018, 12 (03) : 489 - 508
  • [4] Maximum likelihood estimation for discrete latent variable models via evolutionary algorithms
    Brusa, Luca
    Pennoni, Fulvia
    Bartolucci, Francesco
    STATISTICS AND COMPUTING, 2024, 34 (02)
  • [5] Discrete Latent Variable Models
    Bartolucci, Francesco
    Pandolfi, Silvia
    Pennoni, Fulvia
    ANNUAL REVIEW OF STATISTICS AND ITS APPLICATION, 2022, 9 : 425 - 452
  • [6] Decomposed Normalized Maximum Likelihood Codelength Criterion for Selecting Hierarchical Latent Variable Models
    Wu, Tianyi
    Sugawara, Shinya
    Yamanishi, Kenji
    KDD'17: PROCEEDINGS OF THE 23RD ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, 2017, : 1165 - 1174
  • [7] Latent variable discovery in classification models
    Zhang, NL
    Nielsen, TD
    Jensen, FV
    ARTIFICIAL INTELLIGENCE IN MEDICINE, 2004, 30 (03) : 283 - 299
  • [8] Detecting Hierarchical Changes in Latent Variable Models
    Fukushima, Shintaro
    Yamanishi, Kenji
    20TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2020), 2020, : 1028 - 1033
  • [9] An Approximation of the Integrated Classification Likelihood for the Latent Block Model
    Lomet, Aurore
    Govaert, Gerard
    Grandvalet, Yves
    12TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS (ICDMW 2012), 2012, : 147 - 153
  • [10] Latent variable models for probabilistic graph clustering
    Lin, JK
    BAYESIAN INFERENCE AND MAXIMUM ENTROPY METHODS IN SCIENCE AND ENGINEERING, 2004, 735 : 187 - 194