Compressing Networks with Super Nodes

Cited by: 0
Authors
Natalie Stanley
Roland Kwitt
Marc Niethammer
Peter J. Mucha
Affiliations
[1] University of North Carolina at Chapel Hill,Curriculum in Bioinformatics and Computational Biology
[2] University of Salzburg,Department of Computer Science
[3] University of North Carolina at Chapel Hill,Department of Computer Science
[4] University of North Carolina at Chapel Hill,Carolina Center for Interdisciplinary Applied Mathematics
Source
Scientific Reports, Vol. 8
DOI: not available
Abstract
Community detection is a commonly used technique for identifying groups in a network based on similarities in connectivity patterns. To facilitate community detection in large networks, we recast the network as a smaller network of ‘super nodes’, where each super node comprises one or more nodes of the original network. We can then use this super node representation as the input into standard community detection algorithms. To define the seeds, or centers, of our super nodes, we apply the ‘CoreHD’ ranking, a technique applied in network dismantling and decycling problems. We test our approach through the analysis of two common methods for community detection: modularity maximization with the Louvain algorithm and maximum likelihood optimization for fitting a stochastic block model. Our results highlight that applying community detection to the compressed network of super nodes is significantly faster while successfully producing partitions that are more aligned with the local network connectivity and more stable across multiple (stochastic) runs within and between community detection algorithms, yet still overlap well with the results obtained using the full network.
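The pipeline the abstract describes can be sketched in three steps: pick seed nodes with a CoreHD-style ranking (repeatedly take the highest-degree node in the current 2-core), grow each seed into a super node by multi-source BFS, and collapse the original network into a weighted super-node network that can be fed to a standard community detection algorithm. The following is a minimal pure-Python sketch of that idea, not the authors' implementation; the function names and the BFS-based assignment of non-seed nodes are illustrative assumptions.

```python
from collections import deque

def corehd_seeds(adj, num_seeds):
    """Rank seeds CoreHD-style: repeatedly remove (and record) the
    highest-degree node of the current 2-core of the graph."""
    g = {u: set(vs) for u, vs in adj.items()}  # mutable copy
    seeds = []

    def prune_to_2core(g):
        # iteratively delete nodes of degree < 2
        changed = True
        while changed:
            changed = False
            for u in [u for u in g if len(g[u]) < 2]:
                for v in g[u]:
                    g[v].discard(u)
                del g[u]
                changed = True

    while len(seeds) < num_seeds:
        prune_to_2core(g)
        if not g:
            break  # 2-core exhausted before reaching num_seeds
        u = max(g, key=lambda n: len(g[n]))  # highest-degree node in 2-core
        seeds.append(u)
        for v in g[u]:
            g[v].discard(u)
        del g[u]
    return seeds

def assign_to_seeds(adj, seeds):
    """Grow super nodes by multi-source BFS: each node is labeled with
    the seed it is reached from first."""
    label = {s: s for s in seeds}
    q = deque(seeds)
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in label:
                label[v] = label[u]
                q.append(v)
    return label

def supernode_graph(adj, label):
    """Collapse the network: one super node per seed; edge weights count
    original edges running between the corresponding groups."""
    w = {}
    for u in adj:
        for v in adj[u]:
            if u < v:  # count each undirected edge once
                a, b = sorted((label[u], label[v]))
                if a != b:
                    w[(a, b)] = w.get((a, b), 0) + 1
    return w
```

On a toy graph of two triangles joined by a bridge, the two CoreHD seeds land one per triangle, and the compressed network is a single weighted edge between the two resulting super nodes; modularity maximization or SBM fitting would then run on this much smaller weighted graph.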