Compressing Networks with Super Nodes

Cited by: 0
Authors
Natalie Stanley
Roland Kwitt
Marc Niethammer
Peter J. Mucha
Affiliations
[1] University of North Carolina at Chapel Hill, Curriculum in Bioinformatics and Computational Biology
[2] University of Salzburg, Department of Computer Science
[3] University of North Carolina at Chapel Hill, Department of Computer Science
[4] University of North Carolina at Chapel Hill, Carolina Center for Interdisciplinary Applied Mathematics
Source
Scientific Reports, Volume 8
Abstract
Community detection is a commonly used technique for identifying groups in a network based on similarities in connectivity patterns. To facilitate community detection in large networks, we recast the network as a smaller network of ‘super nodes’, where each super node comprises one or more nodes of the original network. We can then use this super node representation as the input into standard community detection algorithms. To define the seeds, or centers, of our super nodes, we apply the ‘CoreHD’ ranking, a technique applied in network dismantling and decycling problems. We test our approach through the analysis of two common methods for community detection: modularity maximization with the Louvain algorithm and maximum likelihood optimization for fitting a stochastic block model. Our results highlight that applying community detection to the compressed network of super nodes is significantly faster while successfully producing partitions that are more aligned with the local network connectivity and more stable across multiple (stochastic) runs within and between community detection algorithms, yet still overlap well with the results obtained using the full network.
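
The abstract describes a concrete pipeline: rank nodes with CoreHD, take the top-ranked nodes as super-node seeds, attach the remaining nodes to seeds, and run standard community detection on the resulting coarse graph. Below is a minimal sketch of that pipeline, assuming the networkx library and its Louvain implementation; the function names, the breadth-first seed assignment, and the choice of ten seeds are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch of the super-node pipeline described above, assuming networkx
# (graph handling + Louvain). This is NOT the authors' code; seed selection and
# node-to-seed assignment are simplified for illustration.
import networkx as nx
from networkx.algorithms.community import louvain_communities


def corehd_order(G):
    """CoreHD-style ranking: repeatedly remove the highest-degree node of the 2-core."""
    H = G.copy()
    order = []
    while True:
        core = nx.k_core(H, k=2)
        if core.number_of_nodes() == 0:
            break
        v = max(core.degree, key=lambda nd: nd[1])[0]  # highest degree inside the 2-core
        order.append(v)
        H.remove_node(v)
    in_core = set(order)
    order.extend(n for n in G if n not in in_core)     # nodes that never enter the 2-core
    return order


def compress_to_super_nodes(G, n_seeds):
    """Pick seeds, attach each node to a nearby seed, and build the weighted coarse graph."""
    seeds = corehd_order(G)[:n_seeds]
    label = {s: s for s in seeds}
    frontier = list(seeds)
    while frontier:                       # breadth-first growth from all seeds at once
        nxt = []
        for u in frontier:
            for w in G.neighbors(u):
                if w not in label:
                    label[w] = label[u]
                    nxt.append(w)
        frontier = nxt
    for n in G:                           # nodes unreachable from any seed stay singletons
        label.setdefault(n, n)

    # Coarse graph: one node per super node, edge weights count original edges.
    S = nx.Graph()
    S.add_nodes_from(set(label.values()))
    for u, v in G.edges():
        a, b = label[u], label[v]
        if a != b:
            S.add_edge(a, b, weight=S[a][b]["weight"] + 1 if S.has_edge(a, b) else 1)
    return S, label


if __name__ == "__main__":
    G = nx.karate_club_graph()
    S, label = compress_to_super_nodes(G, n_seeds=10)
    super_partition = louvain_communities(S, weight="weight", seed=0)
    # Map super-node communities back to the original nodes.
    partition = [{n for n in G if label[n] in community} for community in super_partition]
    print(f"{len(partition)} communities found on the compressed network")
```

Community detection then operates on the coarse graph S, whose size is set by the number of seeds rather than the number of original nodes, which is where the speedup claimed in the abstract would come from under these assumptions.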