FedTAD: Topology-aware Data-free Knowledge Distillation for Subgraph Federated Learning

Cited by: 0
Authors
Zhu, Yinlin [1 ]
Li, Xunkai [2 ]
Wu, Zhengyu [2 ]
Wu, Di [1 ]
Hu, Miao [1 ]
Li, Rong-Hua [2 ]
Affiliations
[1] Sun Yat Sen Univ, Guangzhou, Peoples R China
[2] Beijing Inst Technol, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Subgraph federated learning (subgraph-FL) is a new distributed paradigm that enables the collaborative training of graph neural networks (GNNs) across multiple clients, each holding a local subgraph. Unfortunately, subgraph-FL faces a significant challenge from subgraph heterogeneity, which stems from node and topology variation across clients and impairs the performance of the global GNN. Despite various studies on this problem, the mechanism by which subgraph heterogeneity degrades training has not been thoroughly investigated. To this end, we decouple node and topology variation, revealing that they correspond to differences in label distribution and structure homophily, respectively. Remarkably, these variations lead to significant differences in the class-wise knowledge reliability of the local GNNs, misguiding model aggregation to varying degrees. Building on this insight, we propose FedTAD, a topology-aware data-free knowledge distillation method that enhances reliable knowledge transfer from the local models to the global model. Extensive experiments on six public datasets consistently demonstrate the superiority of FedTAD over state-of-the-art baselines.
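As a reading aid, the sketch below illustrates the abstract's central idea under stated assumptions: estimating a client's class-wise knowledge reliability from its label distribution and per-class structure homophily, then using those scores to weight per-class knowledge during aggregation. This is not the paper's algorithm (which this record does not reproduce); every name here (class_reliability, edge_index, the toy graph, the two-client weighting) is a hypothetical illustration.

```python
# Hypothetical sketch (not the paper's code): estimate each client's
# class-wise knowledge reliability from (a) its label distribution and
# (b) per-class structure homophily, then use the scores to weight
# per-class soft knowledge during server-side aggregation.
import numpy as np

def class_reliability(labels, edge_index, num_classes):
    """Reliability of class c = label frequency of c x homophily around c."""
    labels = np.asarray(labels)
    # (a) label distribution: fraction of this client's nodes in each class
    freq = np.bincount(labels, minlength=num_classes) / len(labels)
    # (b) per-class homophily: among edges touching a class-c node, the
    #     fraction whose two endpoints share the same label
    src, dst = edge_index
    hom = np.zeros(num_classes)
    for c in range(num_classes):
        mask = (labels[src] == c) | (labels[dst] == c)
        if mask.any():
            hom[c] = np.mean(labels[src][mask] == labels[dst][mask])
    return freq * hom  # higher = this client's class-c knowledge is more trustworthy

# Toy subgraph: 6 nodes, 2 classes, each undirected edge listed once.
labels = [0, 0, 0, 1, 1, 1]
edge_index = (np.array([0, 1, 0, 3, 4, 2]),
              np.array([1, 2, 2, 4, 5, 3]))
r = class_reliability(labels, edge_index, num_classes=2)

# Server side: reliability-weighted combination of per-class soft labels
# from two hypothetical clients (client B gets mirrored scores purely to
# make the weighting visible).
client_logits = np.stack([np.array([[2.0, -1.0], [-0.5, 1.5]]),   # client A
                          np.array([[1.0,  0.0], [ 0.0, 2.0]])])  # client B
weights = np.stack([r, r[::-1]])                        # [clients, classes]
weights = weights / weights.sum(axis=0, keepdims=True)  # normalize per class
global_soft = (weights[:, :, None] * client_logits).sum(axis=0)
print("class reliability:", r)
print("aggregated soft labels:\n", global_soft)
```

In FedTAD proper the transfer is data-free distillation rather than the simple logit weighting shown; the point of the sketch is only that underrepresented or low-homophily classes yield less reliable local knowledge, so their contribution to the global model should be down-weighted.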
Pages: 5716-5724
Page count: 9
Related Papers
50 results in total
  • [1] Data-Free Knowledge Distillation for Heterogeneous Federated Learning
    Zhu, Zhuangdi
    Hong, Junyuan
    Zhou, Jiayu
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021
  • [2] A Category-Aware Curriculum Learning for Data-Free Knowledge Distillation
    Li, Xiufang
    Jiao, Licheng
    Sun, Qigong
    Liu, Fang
    Liu, Xu
    Li, Lingling
    Chen, Puhua
    Yang, Shuyuan
    IEEE TRANSACTIONS ON MULTIMEDIA, 2024, 26 : 9603 - 9618
  • [3] DFRD: Data-Free Robustness Distillation for Heterogeneous Federated Learning
    Luo, Kangyang
    Wang, Shuai
    Fu, Yexuan
    Li, Xiang
    Lan, Yunshi
    Gao, Ming
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023
  • [4] Variational Data-Free Knowledge Distillation for Continual Learning
    Li, Xiaorong
    Wang, Shipeng
    Sun, Jian
    Xu, Zongben
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (10) : 12618 - 12634
  • [5] Data-free knowledge distillation via generator-free data generation for Non-IID federated learning
    Zhao, Siran
    Liao, Tianchi
    Fu, Lele
    Chen, Chuan
    Bian, Jing
    Zheng, Zibin
    NEURAL NETWORKS, 2024, 179
  • [6] FedGTA: Topology-aware Averaging for Federated Graph Learning
    Li, Xunkai
    Wu, Zhengyu
    Zhang, Wentao
    Zhu, Yinlin
    Li, Rong-Hua
    Wang, Guoren
    PROCEEDINGS OF THE VLDB ENDOWMENT, 2023, 17 (01) : 41 - 50
  • [7] Topology-aware Federated Learning in Edge Computing: A Comprehensive Survey
    Wu, Jiajun
    Dong, Fan
    Leung, Henry
    Zhu, Zhuangdi
    Zhou, Jiayu
    Drew, Steve
    ACM COMPUTING SURVEYS, 2024, 56 (10)
  • [8] Robust Heterogeneous Federated Learning via Data-Free Knowledge Amalgamation
    Ma, Jun
    Fan, Zheng
    Fan, Chaoyu
    Kang, Qi
    ADVANCES IN SWARM INTELLIGENCE, PT II, ICSI 2024, 2024, 14789 : 61 - 71
  • [9] Conditional generative data-free knowledge distillation
    Yu, Xinyi
    Yan, Ling
    Yang, Yang
    Zhou, Libo
    Ou, Linlin
    IMAGE AND VISION COMPUTING, 2023, 131
  • [10] Data-free Knowledge Distillation for Object Detection
    Chawla, Akshay
    Yin, Hongxu
    Molchanov, Pavlo
    Alvarez, Jose
    2021 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION WACV 2021, 2021 : 3288 - 3297