Sparse and smooth: Improved guarantees for spectral clustering in the dynamic stochastic block model

Cited by: 1
Authors
Keriven, Nicolas [1,2]
Vaiter, Samuel [3,4]
Affiliations
[1] CNRS, Paris, France
[2] GIPSA, St Martin d'Hères, France
[3] Univ Côte d'Azur, CNRS, Nice, France
[4] Univ Côte d'Azur, LJAD, Nice, France
Source
ELECTRONIC JOURNAL OF STATISTICS | 2022, Vol. 16, No. 1
Keywords
Dynamic network; dynamic stochastic block model; spectral clustering; concentration bounds; VARIATIONAL ESTIMATORS; COMMUNITY DETECTION; MAXIMUM-LIKELIHOOD; CONSISTENCY; BLOCKMODELS
DOI
10.1214/22-EJS1986
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline Classification Codes
020208; 070103; 0714
Abstract
In this paper, we analyze classical variants of the Spectral Clustering (SC) algorithm in the Dynamic Stochastic Block Model (DSBM). Existing results show that, in the relatively sparse case where the expected degree grows logarithmically with the number of nodes, guarantees in the static case can be extended to the dynamic case and yield improved error bounds when the DSBM is sufficiently smooth in time, that is, when the communities do not change too much between two consecutive time steps. We improve on these results by drawing a new link between the sparsity and the smoothness of the DSBM: the smoother the DSBM is, the sparser it can be while still guaranteeing consistent recovery. In particular, a mild condition on the smoothness allows us to treat the sparse case with bounded degree. These guarantees are valid for SC applied to either the adjacency matrix or the normalized Laplacian. As a by-product of our analysis, we obtain, to our knowledge, the best spectral concentration bound available for the normalized Laplacian of matrices with independent Bernoulli entries.
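The sketch below is a minimal illustration, in Python, of the general setting described in the abstract: sample a dynamic SBM in which only a few nodes switch community per step, average the adjacency matrices over a short time window, and run spectral clustering on the smoothed matrix. It is not the paper's algorithm or analysis; the DSBM sampler, the window length `window`, the parameters (`p_in`, `p_out`, `flip_prob`), and the use of scikit-learn's KMeans on the leading eigenvectors are all assumptions made for illustration only.

```python
# Illustrative sketch only: spectral clustering on a time-smoothed adjacency
# matrix of a dynamic SBM. Parameters and the sampler are hypothetical choices,
# not the paper's exact algorithm or constants.
import numpy as np
from sklearn.cluster import KMeans

def sample_dsbm(n, k, p_in, p_out, T, flip_prob, rng):
    """Sample T adjacency matrices; at each step a few nodes switch community."""
    labels = rng.integers(k, size=n)
    adjacencies, label_seq = [], []
    for _ in range(T):
        switch = rng.random(n) < flip_prob                 # smoothness: few label changes
        labels = np.where(switch, rng.integers(k, size=n), labels)
        P = np.where(labels[:, None] == labels[None, :], p_in, p_out)
        A = (rng.random((n, n)) < P).astype(float)
        A = np.triu(A, 1)
        A = A + A.T                                        # symmetric, no self-loops
        adjacencies.append(A)
        label_seq.append(labels.copy())
    return adjacencies, label_seq

def spectral_clustering(M, k):
    """Cluster rows of the embedding given by the k leading eigenvectors of M."""
    eigvals, eigvecs = np.linalg.eigh(M)
    U = eigvecs[:, np.argsort(np.abs(eigvals))[-k:]]
    return KMeans(n_clusters=k, n_init=10).fit_predict(U)

rng = np.random.default_rng(0)
adjacencies, labels = sample_dsbm(n=300, k=3, p_in=0.08, p_out=0.02,
                                  T=10, flip_prob=0.01, rng=rng)
# Smoothing: average the adjacency matrices over a short window before SC;
# this averaging is where sparsity can be traded against temporal smoothness.
window = 5
A_smooth = np.mean(adjacencies[-window:], axis=0)
estimated_communities = spectral_clustering(A_smooth, k=3)
```

The same sketch could be applied to the normalized Laplacian of `A_smooth` instead of the smoothed adjacency matrix itself, which is the other variant covered by the paper's guarantees.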
Pages: 1330 - 1366
Number of pages: 37
Related Papers
50 records in total
  • [21] Partial Recovery Bounds for the Sparse Stochastic Block Model
    Scarlett, Jonathan
    Cevher, Volkan
    2016 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY, 2016: 1904 - 1908
  • [23] Hypothesis testing in sparse weighted stochastic block model
    Yuan, Mingao
    Yang, Fan
    Shang, Zuofeng
    STATISTICAL PAPERS, 2022, 63 (04) : 1051 - 1073
  • [24] PageRank Nibble on the Sparse Directed Stochastic Block Model
    Banerjee, Sayan
    Deka, Prabhanka
    Olvera-Cravioto, Mariana
    ALGORITHMS AND MODELS FOR THE WEB GRAPH, WAW 2023, 2023, 13894 : 147 - 163
  • [25] Limiting spectral distribution of stochastic block model
    Su, Giap Van
    Chen, May-Ru
    Guo, Mei-Hui
    Huang, Hao-Wei
    RANDOM MATRICES-THEORY AND APPLICATIONS, 2023, 12 (04)
  • [26] An improved spectral clustering method for large-scale sparse networks
    Ding, Yi
    Deng, Jiayi
    Zhang, Bo
    STATISTICS AND ITS INTERFACE, 2025, 18 (02) : 257 - 266
  • [27] Preconditioned Spectral Clustering for Stochastic Block Partition Streaming Graph Challenge
    Zhuzhunashvili, David
    Knyazev, Andrew
    2017 IEEE HIGH PERFORMANCE EXTREME COMPUTING CONFERENCE (HPEC), 2017,
  • [28] Randomized Spectral Clustering in Large-Scale Stochastic Block Models
    Zhang, Hai
    Guo, Xiao
    Chang, Xiangyu
    JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2022, 31 (03) : 887 - 906
  • [29] Attributed graph clustering with subspace stochastic block model
    Chen, Haoran
    Yu, Zhongjing
    Yang, Qinli
    Shao, Junming
    INFORMATION SCIENCES, 2020, 535: 130 - 141
  • [30] Sparse Hypergraph Community Detection Thresholds in Stochastic Block Model
    Zhang, Erchuan
    Suter, David
    Truong, Giang
    Gilani, Syed Zulqarnain
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35, NEURIPS 2022, 2022,