Adaptively Denoising Graph Neural Networks for Knowledge Distillation

Cited: 0
Authors
Guo, Yuxin [1 ]
Yang, Cheng [1 ]
Shi, Chuan [1 ]
Tu, Ke [2 ]
Wu, Zhengwei [2 ]
Zhang, Zhiqiang [2 ]
Zhou, Jun [2 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Beijing, Peoples R China
[2] Ant Financial, Hangzhou, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Graph Neural Networks; Knowledge Distillation;
DOI
10.1007/978-3-031-70371-3_15
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Graph Neural Networks (GNNs) have excelled in various graph-based applications. Recently, knowledge distillation (KD) has provided a new approach to further boosting GNN performance. However, during KD the GNN student may encounter noise while learning from both the GNN teacher and the input graph. The teacher may carry noise, since deep models inevitably fit some noise during training, which then propagates as errors to the student. In addition, noisy structures in the input graph can disrupt message passing in GNNs. We therefore propose DKDG, which adaptively removes noise in both the GNN teacher and the graph structure for better distillation. DKDG comprises two modules: (1) a teacher knowledge denoising module, which separates the teacher's knowledge into noise knowledge and label knowledge, and removes the student parameters that fit the noise knowledge; and (2) a graph structure denoising module, designed to enhance the discrimination of node representations. Specifically, we propose a discrimination-preserving objective based on a total variation loss and update the edge weights between adjacent nodes to minimize this objective. The two modules are integrated through the GNN's forward propagation and trained iteratively. Experiments on five benchmark datasets and three GNNs show that the GNN student distilled by DKDG achieves a 1.86% relative improvement over the best baseline among recent state-of-the-art GNN-based KD methods.
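The graph structure denoising idea in the abstract (a total-variation objective over edges, minimized by reweighting edges between adjacent nodes) can be illustrated with a minimal sketch. This is an assumed form for illustration only: the specific objective, constraints, and update rule (here, a softmax over negative squared feature distances, with temperature `tau`) are hypothetical and may differ from the paper's actual method.

```python
import numpy as np

def total_variation(h, edges, w):
    """TV objective: sum over edges (i, j) of w_ij * ||h_i - h_j||^2."""
    return sum(w_k * np.sum((h[i] - h[j]) ** 2)
               for w_k, (i, j) in zip(w, edges))

def reweight_edges(h, edges, tau=1.0):
    """Hypothetical edge reweighting: edges joining dissimilar nodes get
    smaller weights, lowering the TV objective while keeping the weights
    normalized (softmax over negative squared distances)."""
    d = np.array([np.sum((h[i] - h[j]) ** 2) for i, j in edges])
    e = np.exp(-d / tau)
    return e / e.sum()

# Toy graph: nodes 0 and 1 have similar features; node 2 is an outlier,
# so edge (1, 2) plays the role of a "noisy" structure.
h = np.array([[1.0, 0.0], [0.9, 0.1], [-1.0, 2.0]])
edges = [(0, 1), (1, 2)]
w_uniform = np.ones(len(edges)) / len(edges)
w_denoised = reweight_edges(h, edges)
```

After reweighting, the noisy edge (1, 2) receives a much smaller weight than the clean edge (0, 1), and the total-variation objective drops accordingly.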
Pages: 253-269
Page count: 17
Related Papers
50 records total
  • [41] Distilling Spikes: Knowledge Distillation in Spiking Neural Networks
    Kushawaha, Ravi Kumar
    Kumar, Saurabh
    Banerjee, Biplab
    Velmurugan, Rajbabu
    2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021, : 4536 - 4543
  • [42] Distilling Holistic Knowledge with Graph Neural Networks
    Zhou, Sheng
    Wang, Yucheng
    Chen, Defang
    Chen, Jiawei
    Wang, Xin
    Wang, Can
    Bu, Jiajun
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 10367 - 10376
  • [43] Adaptively Connected Neural Networks
    Wang, Guangrun
    Wang, Keze
    Lin, Liang
    2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2019), 2019, : 1781 - 1790
  • [44] Constructing Deep Spiking Neural Networks from Artificial Neural Networks with Knowledge Distillation
    Xu, Qi
    Li, Yaxin
    Shen, Jiangrong
    Liu, Jian K.
    Tang, Huajin
    Pan, Gang
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR, 2023, : 7886 - 7895
  • [45] Reconstructed Graph Neural Network With Knowledge Distillation for Lightweight Anomaly Detection
    Zhou, Xiaokang
    Wu, Jiayi
    Liang, Wei
    Wang, Kevin I-Kai
    Yan, Zheng
    Yang, Laurence T.
    Jin, Qun
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (09) : 11817 - 11828
  • [46] Frameless Graph Knowledge Distillation
    Shi, Dai
    Shao, Zhiqi
    Gao, Junbin
    Wang, Zhiyong
    Guo, Yi
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024,
  • [47] Denoising Aggregation of Graph Neural Networks by Using Principal Component Analysis
    Dong, Wei
    Wozniak, Marcin
    Wu, Junsheng
    Li, Weigang
    Bai, Zongwen
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2023, 19 (03) : 2385 - 2394
  • [48] Graph Knowledge Transfer for Offensive Language Identification with Graph Neural Networks
    Huang, Yen-Hao
    Harryyanto, Kevin
    Tsai, Che-Wei
    Pornvattanavichai, Ratana
    Chen, Yi-Shin
    2022 IEEE 23RD INTERNATIONAL CONFERENCE ON INFORMATION REUSE AND INTEGRATION FOR DATA SCIENCE (IRI 2022), 2022, : 216 - 221
  • [49] Multi-Scale Distillation from Multiple Graph Neural Networks
    Zhang, Chunhai
    Liu, Jie
    Dang, Kai
    Zhang, Wenzheng
    THIRTY-SIXTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FOURTH CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE / THE TWELFTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022, : 4337 - 4344
  • [50] Automatic Modulation Classification with Neural Networks via Knowledge Distillation
    Wang, Shuai
    Liu, Chunwu
    ELECTRONICS, 2022, 11 (19)