Enhanced Scalable Graph Neural Network via Knowledge Distillation

Cited by: 0
Authors
Mai, Chengyuan [1 ,2 ]
Chang, Yaomin [1 ,2 ]
Chen, Chuan [1 ,2 ]
Zheng, Zibin [2 ,3 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Comp Sci & Engn, Guangzhou 510006, Peoples R China
[2] Sun Yat Sen Univ, Natl Engn Res Ctr Digital Life, Guangzhou 510006, Peoples R China
[3] Sun Yat Sen Univ, Sch Software Engn, Zhuhai 519000, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Graph neural networks; Scalability; Computational modeling; Convolution; Training; Spectral analysis; Data mining; Graph neural network (GNN); knowledge distillation (KD); network embedding; scalability;
DOI
Not available
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Graph neural networks (GNNs) have achieved state-of-the-art performance in various graph representation learning scenarios. However, when applied to real-world graph data, GNNs encounter scalability issues. Existing GNNs often incur a high computational load in both the training and inference stages, making them incapable of meeting the performance needs of large-scale scenarios with many nodes. Although several studies on scalable GNNs have been conducted, they either improve GNNs with only limited scalability or do so at the expense of effectiveness. Inspired by the success of knowledge distillation (KD) in preserving performance while improving scalability in computer vision and natural language processing, we propose an enhanced scalable GNN via KD (KD-SGNN) to improve both the scalability and the effectiveness of GNNs. On the one hand, KD-SGNN adopts the idea of decoupled GNNs, which separates feature transformation from feature propagation and leverages preprocessing techniques to improve scalability. On the other hand, KD-SGNN introduces two KD mechanisms, i.e., soft-target (ST) distillation and shallow imitation (SI) distillation, to improve expressiveness. The scalability and effectiveness of KD-SGNN are evaluated on multiple real-world datasets, and the effectiveness of the proposed KD mechanisms is further verified through comprehensive analyses.
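The two ideas the abstract combines can be illustrated with a minimal sketch: decoupled GNNs precompute k hops of feature propagation once (as in SGC-style models), so training reduces to a plain MLP, and soft-target distillation matches the student's softened predictions to a teacher's. This is a generic illustration under standard definitions, not the paper's actual KD-SGNN implementation; all function names here are hypothetical.

```python
import numpy as np

def normalize_adj(adj):
    # Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}.
    a = adj + np.eye(adj.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(a.sum(axis=1))
    return a * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def precompute_propagation(adj, features, k=2):
    # Decoupled-GNN preprocessing: propagate features k hops once, offline.
    # Training then only needs an MLP on the smoothed features, which is
    # what makes this family of models scale to large graphs.
    a_hat = normalize_adj(adj)
    x = features
    for _ in range(k):
        x = a_hat @ x
    return x

def soft_target_loss(student_logits, teacher_logits, temperature=2.0):
    # Soft-target distillation: KL divergence between temperature-softened
    # teacher and student class distributions, averaged over nodes.
    def softmax(z):
        e = np.exp(z - z.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)
    p = softmax(teacher_logits / temperature)
    q = softmax(student_logits / temperature)
    return float(np.mean(np.sum(p * (np.log(p) - np.log(q)), axis=1)))
```

In a full pipeline, `precompute_propagation` would run once before training, and `soft_target_loss` would be added (with a weight) to the usual cross-entropy on labeled nodes.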
Pages: 1258-1271
Number of pages: 14
Related Papers
50 items in total
  • [41] Persistence Enhanced Graph Neural Network
    Zhao, Qi
    Ye, Ze
    Chen, Chao
    Wang, Yusu
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108, 2020, 108 : 2896 - 2905
  • [42] Scalable Graph Neural Networks via Bidirectional Propagation
    Chen, Ming
    Wei, Zhewei
    Ding, Bolin
    Li, Yaliang
    Yuan, Ye
    Du, Xiaoyong
    Wen, Ji-Rong
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [43] Self-Guidance: Improve Deep Neural Network Generalization via Knowledge Distillation
    Zheng, Zhenzhu
    Peng, Xi
    2022 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV 2022), 2022, : 3451 - 3460
  • [44] Heterogeneous Graph Neural Network via Knowledge Relations for Fake News Detection
    Xie, Bingbing
    Ma, Xiaoxiao
    Wu, Jia
    Yang, Jian
    Xue, Shan
    Fan, Hao
    35TH INTERNATIONAL CONFERENCE ON SCIENTIFIC AND STATISTICAL DATABASE MANAGEMENT, SSDBM 2023, 2023,
  • [45] MAKE:Knowledge Graph Embedding via Multi-Attention neural network
    Liu, Denghui
    Wang, Yanna
    Zhou, Zili
    Dong, Zhaoan
    2022 5TH INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND NATURAL LANGUAGE PROCESSING, MLNLP 2022, 2022, : 347 - 352
  • [46] Heterogeneous graph knowledge distillation neural network incorporating multiple relations and cross-semantic interactions
    Fu, Jinhu
    Li, Chao
    Zhao, Zhongying
    Zeng, Qingtian
    INFORMATION SCIENCES, 2024, 658
  • [47] Knowledge graph enhanced neural collaborative recommendation
    Sang L.
    Xu M.
    Qian S.
    Wu X.
    Expert Systems with Applications, 2021, 164
  • [48] When Pansharpening Meets Graph Convolution Network and Knowledge Distillation
    Yan, Keyu
    Zhou, Man
    Liu, Liu
    Xie, Chengjun
    Hong, Danfeng
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2022, 60
  • [50] Towards Continual Knowledge Graph Embedding via Incremental Distillation
    Liu, Jiajun
    Ke, Wenjun
    Wang, Peng
    Shang, Ziyu
    Gao, Jinhua
    Li, Guozheng
    Ji, Ke
    Liu, Yanhe
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 8, 2024, : 8759 - 8768