Hierarchical clustering with discrete latent variable models and the integrated classification likelihood

Cited by: 0
Authors
Etienne Côme
Nicolas Jouvin
Pierre Latouche
Charles Bouveyron
Affiliations
[1] COSYS/GRETTIA
[2] Université Gustave-Eiffel
[3] Université Paris 1 Panthéon-Sorbonne
[4] FP2M
[5] CNRS FR 2036
[6] Université de Paris
[7] MAP5
[8] CNRS
[9] Université Côte d’Azur
[10] CNRS
[11] Laboratoire J.A. Dieudonné
[12] Inria
[13] Maasai Research Team
Keywords
Mixture models; Block modeling; Co-clustering; Genetic algorithm; Model-based
DOI: not available
Abstract
Finding a set of nested partitions of a dataset is useful for uncovering relevant structure at different scales, but it is often handled with data-dependent methodologies. In this paper, we introduce a general two-step methodology for model-based hierarchical clustering. Taking the integrated classification likelihood criterion as the objective function, this work applies to any discrete latent variable model (DLVM) for which this quantity is tractable. The first step of the methodology maximizes the criterion with respect to the partition. To address the known problem of sub-optimal local maxima found by greedy hill-climbing heuristics, we introduce a new hybrid algorithm, based on a genetic algorithm, that efficiently explores the space of solutions. The resulting algorithm carefully combines and merges different solutions, and allows the joint inference of the number K of clusters as well as the clusters themselves. Starting from this natural partition, the second step of the methodology is a bottom-up greedy procedure that extracts a hierarchy of clusters. In a Bayesian context, this is achieved by treating the Dirichlet cluster proportion prior parameter $\alpha$ as a regularization term controlling the granularity of the clustering. A new approximation of the criterion is derived as a log-linear function of $\alpha$, yielding a simple functional form for the merge decision criterion.
This second step allows the exploration of the clustering at coarser scales. The proposed approach is compared with existing strategies on simulated as well as real settings, and its results are shown to be particularly relevant. A reference implementation of this work is available in the R package greed accompanying the paper.
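The second step described in the abstract (a bottom-up greedy procedure that merges clusters under a model-based criterion) can be illustrated with a minimal sketch. This is not the greed package's API: `greedy_hierarchy` is a hypothetical helper, and the toy `criterion` (negative within-cluster sum of squares on 1-D points) is a stand-in for the paper's ICL-based merge criterion.

```python
from itertools import combinations

def criterion(clusters):
    """Toy objective: negative within-cluster sum of squares (higher is better).
    Stand-in for an ICL-like criterion; `clusters` maps labels to 1-D points."""
    total = 0.0
    for pts in clusters.values():
        m = sum(pts) / len(pts)
        total -= sum((x - m) ** 2 for x in pts)
    return total

def greedy_hierarchy(points, labels):
    """Merge clusters bottom-up, at each step fusing the pair whose merge
    degrades the criterion the least; returns the merge order (a dendrogram)."""
    clusters = {}
    for x, k in zip(points, labels):
        clusters.setdefault(k, []).append(x)
    history = []
    while len(clusters) > 1:
        # Evaluate the criterion after each candidate pairwise merge.
        best = max(
            combinations(clusters, 2),
            key=lambda pair: criterion(
                {**{k: v for k, v in clusters.items() if k not in pair},
                 pair[0]: clusters[pair[0]] + clusters[pair[1]]}
            ),
        )
        a, b = best
        clusters[a] = clusters.pop(a) + clusters.pop(b)
        history.append((a, b))
    return history

# Three well-separated groups: the two closest clusters (1 and 2) merge first.
merges = greedy_hierarchy([0.0, 0.1, 5.0, 5.1, 9.0], [0, 0, 1, 1, 2])
```

In the paper this merge decision is driven by the ICL as a function of the Dirichlet parameter $\alpha$, so the hierarchy is indexed by a regularization path rather than by a raw distance.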
Pages: 957-986 (29 pages)
Related papers
(50 entries)
  • [31] Likelihood based hierarchical clustering
    Castro, RM
    Coates, MJ
    Nowak, RD
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2004, 52 (08) : 2308 - 2321
  • [32] Sequential Dynamic Classification Using Latent Variable Models
    Lee, Seung Min
    Roberts, Stephen J.
    COMPUTER JOURNAL, 2010, 53 (09): 1415 - 1429
  • [33] A Composite Likelihood Inference in Latent Variable Models for Ordinal Longitudinal Responses
    Vassilis G. S. Vasdekis
    Silvia Cagnone
    Irini Moustaki
    Psychometrika, 2012, 77 : 425 - 441
  • [34] Understanding Masked Autoencoders via Hierarchical Latent Variable Models
    Kong, Lingjing
    Ma, Martin Q.
    Chen, Guangyi
    Xing, Eric P.
    Chi, Yuejie
    Morency, Louis-Philippe
    Zhang, Kun
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR, 2023, : 7918 - 7928
  • [36] Asymptotic properties of adaptive maximum likelihood estimators in latent variable models
    Bianconcini, Silvia
    BERNOULLI, 2014, 20 (03) : 1507 - 1531
  • [37] On the convergence of the Monte Carlo maximum likelihood method for latent variable models
    Cappé, O
    Douc, R
    Moulines, E
    Robert, C
    SCANDINAVIAN JOURNAL OF STATISTICS, 2002, 29 (04) : 615 - 635
  • [38] A COMPARISON OF DISCRETE LATENT VARIABLE MODELS FOR SPEECH REPRESENTATION LEARNING
    Zhou, Henry
    Baevski, Alexei
    Auli, Michael
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 3050 - 3054
  • [39] Searching for valid psychiatric phenotypes: discrete latent variable models
    Leoutsakos, Jeannie-Marie S.
    Zandi, Peter P.
    Bandeen-Roche, Karen
    Lyketsos, Constantine G.
    INTERNATIONAL JOURNAL OF METHODS IN PSYCHIATRIC RESEARCH, 2010, 19 (02) : 63 - 73
  • [40] Fair Inference for Discrete Latent Variable Models: An Intersectional Approach
    Islam, Rashidul
    Pan, Shimei
    Foulds, James R.
    PROCEEDINGS OF THE 2024 INTERNATIONAL CONFERENCE ON INFORMATION TECHNOLOGY FOR SOCIAL GOOD, GOODIT 2024, 2024, : 188 - 196