FedTAD: Topology-aware Data-free Knowledge Distillation for Subgraph Federated Learning

Cited by: 0
Authors
Zhu, Yinlin [1 ]
Li, Xunkai [2 ]
Wu, Zhengyu [2 ]
Wu, Di [1 ]
Hu, Miao [1 ]
Li, Rong-Hua [2 ]
Affiliations
[1] Sun Yat Sen Univ, Guangzhou, Peoples R China
[2] Beijing Inst Technol, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
DOI
Not available
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Subgraph federated learning (subgraph-FL) is a new distributed paradigm that facilitates the collaborative training of graph neural networks (GNNs) across multi-client subgraphs. Unfortunately, a significant challenge of subgraph-FL arises from subgraph heterogeneity, which stems from node and topology variation and impairs the performance of the global GNN. Despite various studies, the impact mechanism of subgraph heterogeneity has not yet been thoroughly investigated. To this end, we decouple node and topology variation, revealing that they correspond to differences in label distribution and structure homophily. Remarkably, these variations lead to significant differences in the class-wise knowledge reliability of multiple local GNNs, misleading model aggregation to varying degrees. Building on this insight, we propose FedTAD, a topology-aware data-free knowledge distillation technique that enhances reliable knowledge transfer from the local models to the global model. Extensive experiments on six public datasets consistently demonstrate the superiority of FedTAD over state-of-the-art baselines.
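For illustration only, the sketch below shows one way the abstract's idea could look in code, under assumed details this record does not specify: each client's class-wise reliability is estimated from its label distribution and per-class edge homophily, and the global model is then distilled from the local models on randomly generated pseudo-graphs, with each teacher's soft predictions weighted class-wise by that reliability. All names (class_reliability, distill_step, TinyGNNLike, pseudo_x, pseudo_edge_index) are hypothetical, and the actual FedTAD generator and objectives differ in detail.

```python
# Minimal, illustrative sketch of topology-aware data-free distillation
# (assumptions only; not the authors' released FedTAD implementation).
import torch
import torch.nn.functional as F


def class_reliability(labels, edge_index, num_classes):
    """Hypothetical proxy for a client's class-wise knowledge reliability:
    class frequency (label distribution) scaled by per-class edge homophily
    (fraction of edges leaving that class whose endpoints share the label)."""
    freq = torch.bincount(labels, minlength=num_classes).float()
    freq = freq / freq.sum().clamp(min=1)
    src, dst = edge_index                      # edge_index: [2, num_edges]
    same = labels[src] == labels[dst]
    homo = torch.zeros(num_classes)
    for c in range(num_classes):
        mask = labels[src] == c
        if mask.any():
            homo[c] = (same & mask).float().sum() / mask.float().sum()
    return freq * homo


class TinyGNNLike(torch.nn.Module):
    """Placeholder 'GNN' so the sketch runs without extra dependencies:
    a linear layer over node features that ignores the edge index."""
    def __init__(self, in_dim, num_classes):
        super().__init__()
        self.lin = torch.nn.Linear(in_dim, num_classes)

    def forward(self, x, edge_index):
        return self.lin(x)


def distill_step(global_model, local_models, reliabilities,
                 pseudo_x, pseudo_edge_index, optimizer):
    """One data-free distillation step: the global (student) model mimics each
    local (teacher) model on pseudo-graph nodes, with every teacher's soft
    predictions re-weighted class-wise by its reliability vector."""
    global_model.train()
    optimizer.zero_grad()
    student_log_prob = F.log_softmax(global_model(pseudo_x, pseudo_edge_index), dim=-1)
    loss = 0.0
    for teacher, rel in zip(local_models, reliabilities):
        with torch.no_grad():
            teacher_prob = F.softmax(teacher(pseudo_x, pseudo_edge_index), dim=-1)
        # down-weight classes on which this teacher's knowledge is unreliable
        weighted = teacher_prob * rel.unsqueeze(0)
        weighted = weighted / weighted.sum(dim=-1, keepdim=True).clamp(min=1e-8)
        loss = loss + F.kl_div(student_log_prob, weighted, reduction="batchmean")
    loss.backward()
    optimizer.step()
    return float(loss)


if __name__ == "__main__":
    torch.manual_seed(0)
    num_classes, in_dim = 3, 8
    # two hypothetical clients with random subgraph labels and edges
    client_labels = [torch.randint(0, num_classes, (20,)), torch.randint(0, num_classes, (30,))]
    client_edges = [torch.randint(0, 20, (2, 60)), torch.randint(0, 30, (2, 90))]
    reliabilities = [class_reliability(y, e, num_classes)
                     for y, e in zip(client_labels, client_edges)]

    local_models = [TinyGNNLike(in_dim, num_classes) for _ in client_labels]
    global_model = TinyGNNLike(in_dim, num_classes)
    opt = torch.optim.Adam(global_model.parameters(), lr=1e-2)

    # "data-free": a random pseudo-graph stands in for the paper's generator output
    pseudo_x = torch.randn(50, in_dim)
    pseudo_edge_index = torch.randint(0, 50, (2, 150))
    for _ in range(5):
        print(distill_step(global_model, local_models, reliabilities,
                           pseudo_x, pseudo_edge_index, opt))
```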
Pages: 5716 - 5724
Number of pages: 9
Related Papers
50 records in total
  • [21] Data-free adaptive structured pruning for federated learning
    Fan, Wei
    Yang, Keke
    Wang, Yifan
    Chen, Cong
    Li, Jing
    JOURNAL OF SUPERCOMPUTING, 2024, 80 (13): 18600 - 18626
  • [22] Data-Free Evaluation of User Contributions in Federated Learning
    Lv, Hongtao
    Zheng, Zhenzhe
    Luo, Tie
    Wu, Fan
    Tang, Shaojie
    Hua, Lifeng
    Jia, Rongfei
    Lv, Chengfei
    2021 19TH INTERNATIONAL SYMPOSIUM ON MODELING AND OPTIMIZATION IN MOBILE, AD HOC, AND WIRELESS NETWORKS (WIOPT), 2021,
  • [23] Interaction Subgraph Sequential Topology-Aware Network for Transferable Recommendation
    Yang, Kang
    Yu, Ruiyun
    Guo, Bingyang
    Li, Jie
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2024, 36 (10) : 5221 - 5233
  • [24] Memory efficient data-free distillation for continual learning
    Li, Xiaorong
    Wang, Shipeng
    Sun, Jian
    Xu, Zongben
    PATTERN RECOGNITION, 2023, 144
  • [25] Data-Free Knowledge Distillation For Image Super-Resolution
    Zhang, Yiman
    Chen, Hanting
    Chen, Xinghao
    Deng, Yiping
    Xu, Chunjing
    Wang, Yunhe
    2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021, 2021, : 7848 - 7857
  • [26] Up to 100× Faster Data-Free Knowledge Distillation
    Fang, Gongfan
    Mo, Kanya
    Wang, Xinchao
    Song, Jie
    Bei, Shitao
    Zhang, Haofei
    Song, Mingli
    THIRTY-SIXTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FOURTH CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE / THE TWELFTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022, : 6597 - 6604
  • [27] Double-Generators Network for Data-Free Knowledge Distillation
    Zhang J.
    Ju J.
    Ren Y.
    Jisuanji Yanjiu yu Fazhan/Computer Research and Development, 2023, 60 (07): 1615 - 1627
  • [28] Effectiveness of Arbitrary Transfer Sets for Data-free Knowledge Distillation
    Nayak, Gaurav Kumar
    Mopuri, Konda Reddy
    Chakraborty, Anirban
    2021 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV 2021), 2021, : 1429 - 1437
  • [29] Unpacking the Gap Box Against Data-Free Knowledge Distillation
    Wang, Yang
    Qian, Biao
    Liu, Haipeng
    Rui, Yong
    Wang, Meng
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2024, 46 (09) : 6280 - 6291
  • [30] Data-free Knowledge Distillation based on GNN for Node Classification
    Zeng, Xinfeng
    Liu, Tao
    Zeng, Ming
    Wu, Qingqiang
    Wang, Meihong
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS, DASFAA 2024, PT 2, 2025, 14851 : 243 - 258