FedEKT: Ensemble Knowledge Transfer for Model-Heterogeneous Federated Learning

Cited by: 0
Authors
Wu, Meihan [1 ]
Li, Li [2 ]
Chang, Tao [1 ]
Qiao, Peng [1 ]
Miao, Cui [1 ]
Zhou, Jie [1 ]
Wang, Jingnan [1 ]
Wang, Xiaodong [1 ]
Affiliations
[1] Natl Univ Def Technol, Changsha, Peoples R China
[2] Univ Macau, Macau, Peoples R China
Keywords
federated learning; model heterogeneity; knowledge transfer;
DOI
10.1109/IWQoS61813.2024.10682872
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline code
0812
Abstract
Federated Learning (FL) enables multiple clients to collaboratively train a shared server model while preserving data privacy. Most existing FL systems assume that the server model and client models share a homogeneous architecture. However, the intensive resource requirements of training prevent low-end devices from contributing their data to the server model. Conversely, the resource constraints of participating clients can severely limit the size of the server model in the model-homogeneous setting, restricting the application scope of FL. In this work, we propose FedEKT, a novel model-heterogeneous FL system designed to obtain a high-performance large server model while also benefiting heterogeneous small client models. Specifically, a new aggregation approach enables the integration of knowledge from heterogeneous client models into a large server model while mitigating the adverse effects of biases stemming from data heterogeneity. Then, so that client models benefit from the high-performance server model, FedEKT distills this large server model into multiple heterogeneous client models, transferring the integrated knowledge back to the clients. In addition, we design specialized modules within the model and a communication strategy that accomplish the aggregation and transfer of knowledge in a data-free manner. Evaluation results demonstrate that FedEKT improves the accuracy of the server model and client models by up to 53.96% and 12.35%, respectively, compared with the state-of-the-art FL approach on CIFAR-100.
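The core mechanic the abstract describes, aggregating soft predictions from heterogeneous client models and distilling them into a server model (and back), can be illustrated with a minimal sketch. This is not the paper's implementation: the functions `ensemble_logits` and `kd_loss`, the uniform client weighting, and the use of NumPy over shared (e.g. synthetic, data-free) inputs are all illustrative assumptions.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def ensemble_logits(client_logits, weights=None):
    """Aggregate per-client logits computed on the same shared inputs.

    client_logits: list of (batch, classes) arrays, one per client.
    Heterogeneous architectures can still be ensembled this way because
    only their output logits, not their weights, are combined.
    """
    stacked = np.stack(client_logits)          # (clients, batch, classes)
    if weights is None:
        weights = np.full(len(client_logits), 1.0 / len(client_logits))
    return np.tensordot(weights, stacked, axes=1)  # (batch, classes)

def kd_loss(student_logits, teacher_logits, T=2.0):
    """Soft-label KL distillation loss, scaled by T^2 as is conventional."""
    p = softmax(teacher_logits, T)             # teacher distribution
    q = softmax(student_logits, T)             # student distribution
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return float(T * T * kl.mean())
```

In a sketch like this, the server model would minimize `kd_loss(server_logits, ensemble_logits(...))` to absorb client knowledge, and each client would later minimize `kd_loss(client_logits, server_logits)` for the reverse transfer.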
Pages: 10