Enhanced Scalable Graph Neural Network via Knowledge Distillation

Cited by: 0
Authors
Mai, Chengyuan [1 ,2 ]
Chang, Yaomin [1 ,2 ]
Chen, Chuan [1 ,2 ]
Zheng, Zibin [2 ,3 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Comp Sci & Engn, Guangzhou 510006, Peoples R China
[2] Sun Yat Sen Univ, Natl Engn Res Ctr Digital Life, Guangzhou 510006, Peoples R China
[3] Sun Yat Sen Univ, Sch Software Engn, Zhuhai 519000, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Graph neural networks; Scalability; Computational modeling; Convolution; Training; Spectral analysis; Data mining; Graph neural network (GNN); knowledge distillation (KD); network embedding; scalability;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Graph neural networks (GNNs) have achieved state-of-the-art performance in various graph representation learning scenarios. However, when applied to real-world graph data, GNNs encounter scalability issues. Existing GNNs often impose a high computational load in both the training and inference stages, making them unable to meet the performance requirements of large-scale scenarios with many nodes. Although several studies on scalable GNNs have been conducted, they either improve scalability only to a limited extent or do so at the expense of effectiveness. Inspired by the success of knowledge distillation (KD) in preserving performance while improving scalability in computer vision and natural language processing, we propose an enhanced scalable GNN via KD (KD-SGNN) to improve both the scalability and the effectiveness of GNNs. On the one hand, KD-SGNN adopts the idea of decoupled GNNs: it decouples feature transformation from feature propagation and leverages preprocessing techniques to improve scalability. On the other hand, KD-SGNN introduces two KD mechanisms, soft-target (ST) distillation and shallow imitation (SI) distillation, to improve expressiveness. The scalability and effectiveness of KD-SGNN are evaluated on multiple real-world datasets, and the effectiveness of the proposed KD mechanisms is further verified through comprehensive analyses.
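The two ingredients named in the abstract can be illustrated generically. The sketch below is not the authors' implementation; it shows (a) decoupled, SGC-style feature propagation precomputed once as preprocessing (so training reduces to fitting a plain MLP on propagated features), and (b) the classic temperature-softened soft-target distillation loss on which ST distillation builds. All function names and the temperature value are illustrative assumptions.

```python
import numpy as np

def normalize_adj(A):
    """Symmetrically normalize an adjacency matrix with self-loops:
    S = D^{-1/2} (A + I) D^{-1/2}."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def precompute_propagation(A, X, k):
    """Decoupled propagation: apply k normalized-adjacency hops to the
    node features once, as a parameter-free preprocessing step.
    The result can then be fed to an ordinary MLP classifier."""
    S = normalize_adj(A)
    H = X.copy()
    for _ in range(k):
        H = S @ H
    return H

def softmax(z, T=1.0):
    """Row-wise softmax with temperature T (numerically stabilized)."""
    z = z / T
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def soft_target_loss(student_logits, teacher_logits, T=2.0):
    """KL divergence between temperature-softened teacher and student
    distributions, scaled by T^2 as in standard soft-target KD."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float((p * (np.log(p) - np.log(q))).sum(axis=1).mean() * T * T)
```

In this decoupled setup the k-hop propagation runs once over the whole graph before training, so each epoch touches only the propagated feature matrix; the soft-target term is then added to the student's supervised loss so the student mimics the teacher's softened class distribution.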
Pages: 1258-1271
Page count: 14