FedTweet: Two-fold Knowledge Distillation for non-IID Federated Learning

Cited by: 1
Authors
Wang, Yanhan [1 ,2 ]
Wang, Wenting [3 ]
Wang, Xin [1 ,2 ]
Zhang, Heng [4 ]
Wu, Xiaoming [1 ,2 ]
Yang, Ming [1 ,2 ]
Affiliations
[1] Qilu Univ Technol, Shandong Acad Sci, Shandong Comp Sci Ctr, Key Lab Comp Power Network & Informat Secur, Minist, Jinan 250014, Peoples R China
[2] Shandong Fundamental Res Ctr Comp Sci, Shandong Prov Key Lab Comp Networks, Jinan 250014, Peoples R China
[3] State Grid Shandong Elect Power Res Inst, Jinan 250003, Peoples R China
[4] Jiangsu Ocean Univ, Sch Comp Engn, Lianyungang 222005, Peoples R China
Keywords
Federated learning; Non-IID data; Knowledge distillation; Adversarial training; Privacy
DOI
10.1016/j.compeleceng.2023.109067
CLC Classification Number
TP3 [Computing technology, computer technology]
Discipline Classification Code
0812
Abstract
Federated Learning (FL) is a distributed learning approach that allows each client to retain its original data locally and share only the parameters of its local updates with the server. While FL can mitigate the "data islands" problem, training on non-independent and identically distributed (non-IID) data still faces the formidable challenge of model performance degradation caused by "client drift" in practical applications. To address this challenge, this paper proposes Two-fold Knowledge Distillation for non-IID Federated Learning (FedTweet), an approach designed for the personalized training of both local and global models under various heterogeneous data distributions. Specifically, the server fine-tunes the initial aggregated model via knowledge distillation on global pseudo-data, and adopts dynamic aggregation weights for the local generators based on model similarity to ensure diversity in the global pseudo-data. Each client freezes the received global model as a teacher model and conducts adversarial training between its local model and local generator, preserving the personalized information in the local updates while correcting their direction. FedTweet thus enables the global and local models to serve as teacher models for each other, providing bidirectional guarantees for both personalization and generalization. Finally, extensive experiments on benchmark datasets demonstrate that FedTweet outperforms several previous FL methods on heterogeneous data.
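The two server-side ingredients described in the abstract, a temperature-softened distillation loss and similarity-based aggregation weights for the local generators, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the function names, the cosine-similarity measure, and the softmax normalization of the weights are illustrative assumptions about how such a scheme could be realized.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis (numerically stable).
    e = np.exp((z - z.max(axis=-1, keepdims=True)) / T)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on temperature-softened distributions --
    # the standard knowledge-distillation objective the server could
    # minimize on global pseudo-data to fine-tune the aggregated model.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.mean(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)))

def similarity_weights(local_params, global_params):
    # One plausible "dynamic aggregation weight": cosine similarity between
    # each client's flattened parameters and the aggregated model,
    # normalized with a softmax so the weights are positive and sum to 1.
    sims = np.array([
        np.dot(w, global_params) / (np.linalg.norm(w) * np.linalg.norm(global_params))
        for w in local_params
    ])
    e = np.exp(sims)
    return e / e.sum()
```

With identical student and teacher logits the distillation loss is zero, and the aggregation weights always form a valid convex combination over clients, so dissimilar (drifted) clients are down-weighted rather than excluded.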
Pages: 13
Related Papers (50 total)
  • [1] Non-IID Federated Learning
    Cao, Longbing
    IEEE INTELLIGENT SYSTEMS, 2022, 37 (02) : 14 - 15
  • [2] Communication-Efficient Federated Learning on Non-IID Data Using Two-Step Knowledge Distillation
    Wen, Hui
    Wu, Yue
    Hu, Jia
    Wang, Zi
    Duan, Hancong
    Min, Geyong
    IEEE INTERNET OF THINGS JOURNAL, 2023, 10 (19) : 17307 - 17322
  • [3] FedKT: Federated learning with knowledge transfer for non-IID data
    Mao, Wenjie
    Yu, Bin
    Zhang, Chen
    Qin, A. K.
    Xie, Yu
    PATTERN RECOGNITION, 2025, 159
  • [4] Learning Critically: Selective Self-Distillation in Federated Learning on Non-IID Data
    He, Yuting
    Chen, Yiqiang
    Yang, XiaoDong
    Yu, Hanchao
    Huang, Yi-Hua
    Gu, Yang
    IEEE TRANSACTIONS ON BIG DATA, 2024, 10 (06) : 789 - 800
  • [5] Federated learning on non-IID data: A survey
    Zhu, Hangyu
    Xu, Jinjin
    Liu, Shiqing
    Jin, Yaochu
    NEUROCOMPUTING, 2021, 465 : 371 - 390
  • [6] Knowledge Discrepancy-Aware Federated Learning for Non-IID Data
    Shen, Jianhua
    Chen, Siguang
    2023 IEEE WIRELESS COMMUNICATIONS AND NETWORKING CONFERENCE, WCNC, 2023,
  • [7] Knowledge-Aware Federated Active Learning with Non-IID Data
    Cao, Yu-Tong
    Shi, Ye
    Yu, Baosheng
    Wang, Jingya
    Tao, Dacheng
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2023), 2023, : 22222 - 22232
  • [8] Federated Learning on Non-IID Graphs via Structural Knowledge Sharing
    Tan, Yue
    Liu, Yixin
    Long, Guodong
    Jiang, Jing
    Lu, Qinghua
    Zhang, Chengqi
    THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 8, 2023, : 9953 - 9961
  • [9] Adaptive Federated Learning With Non-IID Data
    Zeng, Yan
    Mu, Yuankai
    Yuan, Junfeng
    Teng, Siyuan
    Zhang, Jilin
    Wan, Jian
    Ren, Yongjian
    Zhang, Yunquan
    COMPUTER JOURNAL, 2023, 66 (11): : 2758 - 2772
  • [10] Federated Learning With Taskonomy for Non-IID Data
    Jamali-Rad, Hadi
    Abdizadeh, Mohammad
    Singh, Anuj
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (11) : 8719 - 8730