FedTKD: A Trustworthy Heterogeneous Federated Learning Based on Adaptive Knowledge Distillation

Cited by: 6
Authors
Chen, Leiming [1 ]
Zhang, Weishan [1 ]
Dong, Cihao [1 ]
Zhao, Dehai [2 ]
Zeng, Xingjie [3 ]
Qiao, Sibo [4 ]
Zhu, Yichang [1 ]
Tan, Chee Wei [5 ]
Affiliations
[1] China Univ Petr East China, Sch Comp Sci & Technol, Qingdao 266580, Peoples R China
[2] CSIRO Data61, Sydney 2015, Australia
[3] Southwest Petr Univ, Sch Comp Sci, Chengdu 610500, Peoples R China
[4] Tiangong Univ, Sch Software, Tianjin 300387, Peoples R China
[5] Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore 639798, Singapore
Keywords
heterogeneous federated learning; adaptive knowledge distillation; malicious client identification; trustworthy knowledge aggregation;
DOI
10.3390/e26010096
Chinese Library Classification (CLC) number
O4 [Physics];
Discipline code
0702;
Abstract
Federated learning allows multiple parties to train models collaboratively while protecting user privacy. However, traditional federated learning requires every client to share the same model structure in order to fuse a global model. In real-world scenarios, each client may need to develop a personalized model suited to its own environment, which makes federated learning difficult in heterogeneous model settings. Some knowledge distillation methods address heterogeneous model fusion to a certain extent, but they assume that every client is trustworthy. In practice, some clients may contribute malicious or low-quality knowledge, making it difficult to aggregate trustworthy knowledge in a heterogeneous environment. To address these challenges, we propose a trustworthy heterogeneous federated learning framework (FedTKD) that achieves malicious client identification and trustworthy knowledge fusion. First, we propose a malicious client identification method based on client logit features, which excludes malicious information when fusing the global logit. Second, we propose a selective knowledge fusion method to compute a high-quality global logit. In addition, we propose an adaptive knowledge distillation method to improve the accuracy of knowledge transfer from the server to the clients. Finally, we design different attack and data distribution scenarios to validate our method. The experiments show that our method outperforms the baselines, maintaining stable performance under all attack scenarios and achieving an accuracy improvement of 2% to 3% across different data distributions.
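The abstract outlines three server-side steps: identifying malicious clients from their logit features, selectively fusing the remaining clients' logits into a global logit, and distilling that global logit back to each heterogeneous client model. The following is a minimal PyTorch sketch of that pipeline under stated assumptions; the cosine-similarity outlier test, the threshold tau, the plain averaging in fuse_global_logit, and all function names are illustrative guesses, not details taken from the paper.

# Minimal sketch (not the authors' released code) of the pipeline described above:
# (1) flag suspicious clients by comparing their per-class average logits with the
# cohort consensus, (2) fuse only the trusted clients' logits into a global logit,
# (3) transfer the fused knowledge to each client with a temperature-scaled KL loss.
import torch
import torch.nn.functional as F

def identify_trusted_clients(client_logits: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    # client_logits: (num_clients, num_classes) average logits on a shared public set.
    # Returns a boolean mask of clients whose logits agree with the cohort consensus.
    mean_logit = client_logits.mean(dim=0, keepdim=True)          # cohort consensus
    sim = F.cosine_similarity(client_logits, mean_logit, dim=1)   # agreement per client
    return sim >= tau                                             # drop outliers below tau

def fuse_global_logit(client_logits: torch.Tensor, trusted: torch.Tensor) -> torch.Tensor:
    # Average only the trusted clients' logits into a single global logit.
    return client_logits[trusted].mean(dim=0)

def distillation_loss(student_logits: torch.Tensor,
                      global_logits: torch.Tensor,
                      temperature: float = 2.0) -> torch.Tensor:
    # Temperature-scaled KL divergence used to distill the server-side global logit
    # into a heterogeneous client (student) model.
    p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(global_logits / temperature, dim=-1)
    return F.kl_div(p_student, p_teacher, reduction="batchmean") * temperature ** 2

# Toy usage: 5 clients, 10 classes, client 4 sends corrupted (poisoned) logits.
logits = torch.randn(5, 10)
logits[4] = -logits.mean(dim=0)                 # simulate a poisoned update
trusted = identify_trusted_clients(logits)
global_logit = fuse_global_logit(logits, trusted)
student_batch = torch.randn(8, 10)              # one client's logits on a public batch
loss = distillation_loss(student_batch, global_logit.expand_as(student_batch))
print(trusted, loss.item())

In the paper's setting, fusion and distillation would presumably run each round on a shared public dataset; the toy usage above only illustrates the tensor shapes involved.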
Pages: 31
Related papers
50 records in total
  • [1] Fedadkd: heterogeneous federated learning via adaptive knowledge distillation
    Song, Yalin
    Liu, Hang
    Zhao, Shuai
    Jin, Haozhe
    Yu, Junyang
    Liu, Yanhong
    Zhai, Rui
    Wang, Longge
    PATTERN ANALYSIS AND APPLICATIONS, 2024, 27 (04)
  • [2] Heterogeneous Federated Learning Framework for IIoT Based on Selective Knowledge Distillation
    Guo, Sheng
    Chen, Hui
    Liu, Yang
    Yang, Chengyi
    Li, Zengxiang
    Jin, Cheng Hao
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2025, 21 (02) : 1078 - 1089
  • [3] A Prototype-Based Knowledge Distillation Framework for Heterogeneous Federated Learning
    Lyu, Feng
    Tang, Cheng
    Deng, Yongheng
    Liu, Tong
    Zhang, Yongmin
    Zhang, Yaoxue
    2023 IEEE 43RD INTERNATIONAL CONFERENCE ON DISTRIBUTED COMPUTING SYSTEMS, ICDCS, 2023, : 37 - 47
  • [4] Heterogeneous Defect Prediction Based on Federated Transfer Learning via Knowledge Distillation
    Wang, Aili
    Zhang, Yutong
    Yan, Yixin
    IEEE ACCESS, 2021, 9 : 29530 - 29540
  • [5] FedRAD: Heterogeneous Federated Learning via Relational Adaptive Distillation
    Tang, Jianwu
    Ding, Xuefeng
    Hu, Dasha
    Guo, Bing
    Shen, Yuncheng
    Ma, Pan
    Jiang, Yuming
    SENSORS, 2023, 23 (14)
  • [6] Data-Free Knowledge Distillation for Heterogeneous Federated Learning
    Zhu, Zhuangdi
    Hong, Junyuan
    Zhou, Jiayu
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [7] Federated Learning Algorithm Based on Knowledge Distillation
    Jiang, Donglin
    Shan, Chen
    Zhang, Zhihui
    2020 INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND COMPUTER ENGINEERING (ICAICE 2020), 2020, : 163 - 167
  • [8] FEDGKD: Toward Heterogeneous Federated Learning via Global Knowledge Distillation
    Yao, Dezhong
    Pan, Wanning
    Dai, Yutong
    Wan, Yao
    Ding, Xiaofeng
    Yu, Chen
    Jin, Hai
    Xu, Zheng
    Sun, Lichao
    IEEE TRANSACTIONS ON COMPUTERS, 2024, 73 (01) : 3 - 17
  • [9] Digital Twin-Assisted Knowledge Distillation Framework for Heterogeneous Federated Learning
    Xiucheng Wang
    Nan Cheng
    Longfei Ma
    Ruijin Sun
    Rong Chai
    Ning Lu
    China Communications, 2023, 20 (02) : 61 - 78
  • [10] Training Heterogeneous Client Models using Knowledge Distillation in Serverless Federated Learning
    Chadha, Mohak
    Khera, Pulkit
    Gu, Jianfeng
    Abboud, Osama
    Gerndt, Michael
    39TH ANNUAL ACM SYMPOSIUM ON APPLIED COMPUTING, SAC 2024, 2024, : 997 - 1006