On Compressing Social Networks

Cited: 0
Authors
Chierichetti, Flavio [1 ]
Kumar, Ravi
Lattanzi, Silvio [1 ]
Mitzenmacher, Michael
Panconesi, Alessandro [1 ]
Raghavan, Prabhakar
Affiliations
[1] Sapienza Univ Rome, Dipartimento Informat, Rome, Italy
Source
KDD-09: 15TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING | 2009
Keywords
Compression; Social networks; Linear arrangement; Reciprocity;
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Motivated by structural properties of the Web graph that support efficient data structures for in-memory adjacency queries, we study the extent to which a large network can be compressed. Boldi and Vigna (WWW 2004) showed that Web graphs can be compressed down to three bits of storage per edge; we study the compressibility of social networks, where again adjacency queries are a fundamental primitive. To this end, we propose simple combinatorial formulations that encapsulate efficient compressibility of graphs. We show that some of these problems are NP-hard yet admit effective heuristics, some of which can exploit properties of social networks such as link reciprocity. Our extensive experiments show that social networks and the Web graph exhibit vastly different compressibility characteristics.
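The abstract mentions exploiting link reciprocity for compression. As an illustrative sketch only (not the paper's actual encoding), the idea can be demonstrated by storing each reciprocal edge pair once with a flag instead of as two separate directed edges; the function names below are hypothetical:

```python
def encode_with_reciprocity(edges):
    """Encode directed edges, collapsing each reciprocal pair into one record.

    edges: iterable of directed (u, v) pairs.
    Returns a list of (u, v, reciprocal) records covering all edges.
    """
    edge_set = set(edges)
    records = []
    seen = set()
    for (u, v) in edge_set:
        if (u, v) in seen:
            continue
        if u != v and (v, u) in edge_set:
            # Both directions exist: store once, canonically ordered,
            # with a reciprocity flag instead of two full records.
            a, b = min(u, v), max(u, v)
            records.append((a, b, True))
            seen.add((u, v))
            seen.add((v, u))
        else:
            records.append((u, v, False))
            seen.add((u, v))
    return records


def decode(records):
    """Recover the original directed edge set from the records."""
    edges = set()
    for u, v, reciprocal in records:
        edges.add((u, v))
        if reciprocal:
            edges.add((v, u))
    return edges
```

On a toy graph with edges {(1, 2), (2, 1), (1, 3)}, the encoder emits two records instead of three, and decoding recovers the original edge set; on real social networks with high reciprocity, this kind of flagging is one source of the savings the paper's heuristics target.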
Pages: 219 - 227
Number of pages: 9
Related Papers
50 records
  • [41] A Generalist Reinforcement Learning Agent for Compressing Convolutional Neural Networks
    Gonzalez-Sahagun, Gabriel
    Conant-Pablos, Santiago Enrique
    Ortiz-Bayliss, Jose Carlos
    Cruz-Duarte, Jorge M.
    IEEE ACCESS, 2024, 12 : 51100 - 51114
  • [42] Cross-Entropy Pruning for Compressing Convolutional Neural Networks
    Bao, Rongxin
    Yuan, Xu
    Chen, Zhikui
    Ma, Ruixin
    NEURAL COMPUTATION, 2018, 30 (11) : 3128 - 3149
  • [43] Compressing LSTM Networks with Hierarchical Coarse-Grain Sparsity
    Kadetotad, Deepak
    Meng, Jian
    Berisha, Visar
    Chakrabarti, Chaitali
    Seo, Jae-sun
    INTERSPEECH 2020, 2020, : 21 - 25
  • [44] Compressing Convolutional Neural Networks via Factorized Convolutional Filters
    Li, Tuanhui
    Wu, Baoyuan
    Yang, Yujiu
    Fan, Yanbo
    Zhang, Yong
    Liu, Wei
    2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2019), 2019, : 3972 - 3981
  • [45] A Unified Approximation Framework for Compressing and Accelerating Deep Neural Networks
    Ma, Yuzhe
    Chen, Ran
    Li, Wei
    Shang, Fanhua
    Yu, Wenjian
    Cho, Minsik
    Yu, Bei
    2019 IEEE 31ST INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE (ICTAI 2019), 2019, : 376 - 383
  • [46] Research on method for compressing historical data based on wavelet networks
    Jiang, Peng
    Huang, Qingbo
    2006 INTERNATIONAL CONFERENCE ON COMPUTATIONAL INTELLIGENCE AND SECURITY, PTS 1 AND 2, PROCEEDINGS, 2006, : 1609 - 1613
  • [47] Compressing Recurrent Neural Networks with Tensor Ring for Action Recognition
    Pan, Yu
    Xu, Jing
    Wang, Maolin
    Ye, Jinmian
    Wang, Fei
    Bai, Kun
    Xu, Zenglin
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 4683 - 4690
  • [48] Compressing deep-quaternion neural networks with targeted regularisation
    Vecchi, Riccardo
    Scardapane, Simone
    Comminiello, Danilo
    Uncini, Aurelio
    CAAI TRANSACTIONS ON INTELLIGENCE TECHNOLOGY, 2020, 5 (03) : 172 - 176
  • [49] Data aggregation in Wireless Sensor Networks: Compressing or Forecasting?
    Cui, Jin
    Valois, Fabrice
    2014 IEEE WIRELESS COMMUNICATIONS AND NETWORKING CONFERENCE (WCNC), 2014, : 2892 - 2897
  • [50] Compressing convolutional neural networks with cheap convolutions and online distillation
    Xie, Jiao
    Lin, Shaohui
    Zhang, Yichen
    Luo, Linkai
    DISPLAYS, 2023, 78