FedTAD: Topology-aware Data-free Knowledge Distillation for Subgraph Federated Learning

Cited by: 0
Authors
Zhu, Yinlin [1]
Li, Xunkai [2]
Wu, Zhengyu [2]
Wu, Di [1]
Hu, Miao [1]
Li, Rong-Hua [2]
Affiliations
[1] Sun Yat-sen Univ, Guangzhou, Peoples R China
[2] Beijing Inst Technol, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
DOI
None available
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Subgraph federated learning (subgraph-FL) is a new distributed paradigm that facilitates the collaborative training of graph neural networks (GNNs) across multi-client subgraphs. Unfortunately, a significant challenge of subgraph-FL arises from subgraph heterogeneity, which stems from node and topology variation and impairs the performance of the global GNN. Despite various studies, the impact mechanism of subgraph heterogeneity has not yet been thoroughly investigated. To this end, we decouple node and topology variation, revealing that they correspond to differences in label distribution and structure homophily. Remarkably, these variations lead to significant differences in the class-wise knowledge reliability of multiple local GNNs, misguiding model aggregation to varying degrees. Building on this insight, we propose FedTAD, a topology-aware data-free knowledge distillation technique that enhances reliable knowledge transfer from the local models to the global model. Extensive experiments on six public datasets consistently demonstrate the superiority of FedTAD over state-of-the-art baselines.
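To make the abstract's two ingredients concrete, the following is a minimal, hypothetical PyTorch sketch (not the authors' released code) of (a) per-class knowledge reliability estimated from a client's label distribution and structure homophily, and (b) a reliability-weighted, data-free distillation step from local GNNs to the global GNN on server-generated pseudo graphs. All function names, the exact weighting scheme, and the GNN forward signature model(x, edge_index) are illustrative assumptions.

# Illustrative sketch of FedTAD's core idea (assumptions noted above).
import torch
import torch.nn.functional as F

def class_reliability(labels, edge_index, num_classes):
    """Per-class reliability = label frequency x class-wise edge homophily."""
    # Label distribution: classes a client has seen more often are more reliable.
    freq = torch.bincount(labels, minlength=num_classes).float()
    freq = freq / freq.sum().clamp(min=1)
    # Class-wise homophily: fraction of edges whose endpoints share the
    # source node's class, accumulated per class.
    src, dst = edge_index
    same = (labels[src] == labels[dst]).float()
    homo = torch.zeros(num_classes, device=labels.device)
    cnt = torch.zeros(num_classes, device=labels.device)
    homo.scatter_add_(0, labels[src], same)
    cnt.scatter_add_(0, labels[src], torch.ones_like(same))
    homo = homo / cnt.clamp(min=1)
    return freq * homo  # higher = more trustworthy class knowledge

def distill_step(global_model, local_models, reliabilities,
                 pseudo_x, pseudo_edge, optimizer, tau=1.0):
    """One server-side distillation step on generator-produced pseudo graphs."""
    optimizer.zero_grad()
    g_logits = global_model(pseudo_x, pseudo_edge)
    # Normalize reliability across clients so per-class weights sum to 1.
    R = torch.stack(reliabilities)                 # [num_clients, num_classes]
    W = R / R.sum(dim=0, keepdim=True).clamp(min=1e-8)
    loss = 0.0
    for k, local in enumerate(local_models):
        with torch.no_grad():
            t_logits = local(pseudo_x, pseudo_edge)
        # Per-node KL from teacher to student, weighted by how reliable this
        # teacher is on the class it predicts for each node.
        kl = F.kl_div(F.log_softmax(g_logits / tau, dim=1),
                      F.softmax(t_logits / tau, dim=1),
                      reduction='none').sum(dim=1)
        w = W[k][t_logits.argmax(dim=1)]
        loss = loss + (w * kl).mean()
    loss.backward()
    optimizer.step()
    return loss.item()

Under this reading, a teacher whose subgraph is label-scarce or heterophilic for a class contributes little to the global model on that class, which is one plausible way to operationalize "class-wise knowledge reliability" from the abstract.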
Pages: 5716-5724
Page count: 9
Related Papers
50 items in total
  • [31] FedGhost: Data-Free Model Poisoning Enhancement in Federated Learning
    Ma, Zhuoran
    Huang, Xinyi
    Wang, Zhuzhu
    Qin, Zhan
    Wang, Xiangyu
    Ma, Jianfeng
    IEEE TRANSACTIONS ON INFORMATION FORENSICS AND SECURITY, 2025, 20 : 2096 - 2108
  • [32] DENSE: Data-Free One-Shot Federated Learning
    Zhang, Jie
    Chen, Chen
    Li, Bo
    Lyu, Lingjuan
    Wu, Shuang
    Ding, Shouhong
    Shen, Chunhua
    Wu, Chao
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [33] Topology-Aware Indexing System for Urban Knowledge
    Antonini, Alessio
    Boella, Guido
    Buccoliero, Stefania
    Lupi, Lucia
    Schifanella, Claudio
    2017 COMPUTING CONFERENCE, 2017, : 1003 - 1010
  • [34] Momentum Adversarial Distillation: Handling Large Distribution Shifts in Data-Free Knowledge Distillation
    Do, Kien
    Le, Hung
    Dung Nguyen
    Dang Nguyen
    Harikumar, Haripriya
    Truyen Tran
    Rana, Santu
    Venkatesh, Svetha
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [35] D3K: Dynastic Data-Free Knowledge Distillation
    Li, Xiufang
    Sun, Qigong
    Jiao, Licheng
    Liu, Fang
    Liu, Xu
    Li, Lingling
    Chen, Puhua
    Zuo, Yi
    IEEE TRANSACTIONS ON MULTIMEDIA, 2023, 25 : 8358 - 8371
  • [36] Data-Free Knowledge Distillation with Soft Targeted Transfer Set Synthesis
    Wang, Zi
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 10245 - 10253
  • [37] Conditional pseudo-supervised contrast for data-free knowledge distillation
    Shao, Renrong
    Zhang, Wei
    Wang, Jun
    PATTERN RECOGNITION, 2023, 143
  • [38] Data-free Knowledge Distillation for Fine-grained Visual Categorization
    Shao, Renrong
    Zhang, Wei
    Yin, Jianhua
    Wang, Jun
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION, ICCV, 2023, : 1515 - 1525
  • [39] A Network Resource Aware Federated Learning Approach using Knowledge Distillation
    Mishra, Rahul
    Gupta, Hari Prabhat
    Dutta, Tanima
    IEEE CONFERENCE ON COMPUTER COMMUNICATIONS WORKSHOPS (IEEE INFOCOM WKSHPS 2021), 2021,
  • [40] Learning to Retain while Acquiring: Combating Distribution-Shift in Adversarial Data-Free Knowledge Distillation
    Patel, Gaurav
    Mopuri, Konda Reddy
    Qiu, Qiang
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR, 2023, : 7786 - 7794