FedTweet: Two-fold Knowledge Distillation for non-IID Federated Learning

Cited by: 1
Authors
Wang, Yanhan [1 ,2 ]
Wang, Wenting [3 ]
Wang, Xin [1 ,2 ]
Zhang, Heng [4 ]
Wu, Xiaoming [1 ,2 ]
Yang, Ming [1 ,2 ]
Affiliations
[1] Qilu Univ Technol, Shandong Acad Sci, Shandong Comp Sci Ctr, Key Lab Comp Power Network & Informat Secur, Minist, Jinan 250014, Peoples R China
[2] Shandong Fundamental Res Ctr Comp Sci, Shandong Prov Key Lab Comp Networks, Jinan 250014, Peoples R China
[3] State Grid Shandong Elect Power Res Inst, Jinan 250003, Peoples R China
[4] Jiangsu Ocean Univ, Sch Comp Engn, Lianyungang 222005, Peoples R China
Keywords
Federated learning; Non-IID data; Knowledge distillation; Adversarial training; PRIVACY;
DOI
10.1016/j.compeleceng.2023.109067
CLC number
TP3 [Computing technology, computer technology];
Discipline classification code
0812;
Abstract
Federated Learning (FL) is a distributed learning approach that allows each client to retain its original data locally and share only the parameters of its local updates with the server. While FL can mitigate the problem of "data islands", training on non-independent and identically distributed (non-IID) data still faces the formidable challenge of model performance degradation due to "client drift" in practical applications. To address this challenge, we design a novel approach termed "Two-fold Knowledge Distillation for non-IID Federated Learning" (FedTweet), tailored for the personalized training of both local and global models across heterogeneous data settings. Specifically, the server fine-tunes the initial aggregated model through knowledge distillation on global pseudo-data and adopts dynamic aggregation weights for the local generators based on model similarity, ensuring diversity in the global pseudo-data. Each client freezes the received global model as a teacher model and conducts adversarial training between its local model and local generator, preserving the personalized information in the local updates while correcting their directions. FedTweet thus enables the global and local models to serve as teacher models for each other, providing bidirectional guarantees for personalization and generalization. Finally, extensive experiments on benchmark datasets demonstrate that FedTweet outperforms several previous FL methods on heterogeneous data.
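The similarity-based dynamic aggregation the abstract mentions can be illustrated with a small sketch. This is a hypothetical reading, not the paper's actual method: the record does not specify the similarity metric or weighting rule, so the sketch assumes cosine similarity of each client's parameter vector to the plain average, turned into weights via a softmax.

```python
import math

def cosine(u, v):
    """Cosine similarity between two parameter vectors (plain lists)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v + 1e-12)

def similarity_weights(client_params, temperature=1.0):
    """Softmax over cosine similarities to the average parameter vector.

    The average is a hypothetical stand-in for whatever reference model
    FedTweet actually compares against; clients whose updates align with
    the consensus direction receive larger aggregation weights.
    """
    n = len(client_params)
    dim = len(client_params[0])
    avg = [sum(p[i] for p in client_params) / n for i in range(dim)]
    sims = [cosine(p, avg) for p in client_params]
    m = max(sims)  # subtract max for numerical stability
    exps = [math.exp((s - m) / temperature) for s in sims]
    z = sum(exps)
    return [e / z for e in exps]

def aggregate(client_params, weights):
    """Weighted average of the client parameter vectors."""
    dim = len(client_params[0])
    return [sum(w * p[i] for w, p in zip(weights, client_params))
            for i in range(dim)]
```

For example, a client whose update points opposite to the consensus (e.g. `[-1.0, 0.0]` against two clients near `[1.0, 0.0]`) gets a smaller weight, which damps its influence on the aggregated generator.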
Pages: 13
Related Papers
50 records in total
  • [21] Non-IID Federated Learning With Sharper Risk Bound
    Wei, Bojian
    Li, Jian
    Liu, Yong
    Wang, Weiping
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (05) : 6906 - 6917
  • [22] Adaptive Federated Deep Learning With Non-IID Data
    Zhang, Ze-Hui
    Li, Qing-Dan
    Fu, Yao
    He, Ning-Xin
    Gao, Tie-Gang
    Zidonghua Xuebao/Acta Automatica Sinica, 2023, 49 (12): : 2493 - 2506
  • [23] A Novel Approach for Federated Learning with Non-IID Data
    Nguyen, Hiep
    Warrier, Harikrishna
    Gupta, Yogesh
    2022 9TH INTERNATIONAL CONFERENCE ON SOFT COMPUTING & MACHINE INTELLIGENCE, ISCMI, 2022, : 62 - 67
  • [24] Federated Dictionary Learning from Non-IID Data
    Gkillas, Alexandros
    Ampeliotis, Dimitris
    Berberidis, Kostas
    2022 IEEE 14TH IMAGE, VIDEO, AND MULTIDIMENSIONAL SIGNAL PROCESSING WORKSHOP (IVMSP), 2022,
  • [25] EFL: ELASTIC FEDERATED LEARNING ON NON-IID DATA
    Ma, Zichen
    Lu, Yu
    Li, Wenye
    Cui, Shuguang
    CONFERENCE ON LIFELONG LEARNING AGENTS, VOL 199, 2022, 199
  • [26] Dual Adversarial Federated Learning on Non-IID Data
    Zhang, Tao
    Yang, Shaojing
    Song, Anxiao
    Li, Guangxia
    Dong, Xuewen
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, KSEM 2022, PT III, 2022, 13370 : 233 - 246
  • [27] Decoupled Federated Learning for ASR with Non-IID Data
    Zhu, Han
    Wang, Jindong
    Cheng, Gaofeng
    Zhang, Pengyuan
    Yan, Yonghong
    INTERSPEECH 2022, 2022, : 2628 - 2632
  • [28] FedEL: Federated ensemble learning for non-iid data
    Wu, Xing
    Pei, Jie
    Han, Xian-Hua
    Chen, Yen-Wei
    Yao, Junfeng
    Liu, Yang
    Qian, Quan
    Guo, Yike
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 237
  • [29] Contractible Regularization for Federated Learning on Non-IID Data
    Chen, Zifan
    Wu, Zhe
    Wu, Xian
    Zhang, Li
    Zhao, Jie
    Yan, Yangtian
    Zheng, Yefeng
    2022 IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2022, : 61 - 70
  • [30] Optimizing Federated Learning on Non-IID Data with Reinforcement Learning
    Wang, Hao
    Kaplan, Zakhary
    Niu, Di
    Li, Baochun
    IEEE INFOCOM 2020 - IEEE CONFERENCE ON COMPUTER COMMUNICATIONS, 2020, : 1698 - 1707