FedTKD: A Trustworthy Heterogeneous Federated Learning Based on Adaptive Knowledge Distillation

Cited by: 6
|
Authors
Chen, Leiming [1 ]
Zhang, Weishan [1 ]
Dong, Cihao [1 ]
Zhao, Dehai [2 ]
Zeng, Xingjie [3 ]
Qiao, Sibo [4 ]
Zhu, Yichang [1 ]
Tan, Chee Wei [5 ]
Affiliations
[1] China Univ Petr East China, Sch Comp Sci & Technol, Qingdao 266580, Peoples R China
[2] CSIRO Data61, Sydney 2015, Australia
[3] Southwest Petr Univ, Sch Comp Sci, Chengdu 610500, Peoples R China
[4] Tiangong Univ, Sch Software, Tianjin 300387, Peoples R China
[5] Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore 639798, Singapore
Keywords
heterogeneous federated learning; adaptive knowledge distillation; malicious client identification; trustworthy knowledge aggregation;
DOI
10.3390/e26010096
Chinese Library Classification
O4 [Physics];
Subject Classification Code
0702;
Abstract
Federated learning allows multiple parties to train models jointly while protecting user privacy. However, traditional federated learning requires every client to share the same model structure in order to fuse a global model. In real-world scenarios, each client may need a personalized model suited to its own environment, which makes federated learning difficult in a heterogeneous-model setting. Knowledge distillation methods address the problem of heterogeneous model fusion to some extent, but they assume that every client is trustworthy; some clients may contribute malicious or low-quality knowledge, making trustworthy knowledge aggregation difficult in a heterogeneous environment. To address these challenges, we propose a trustworthy heterogeneous federated learning framework (FedTKD) to achieve malicious client identification and trustworthy knowledge fusion. First, we propose a malicious client identification method based on client logit features, which excludes malicious information when fusing the global logits. Second, we propose a selective knowledge fusion method to compute high-quality global logits. Third, we propose an adaptive knowledge distillation method to improve the accuracy of knowledge transfer from the server side to the client side. Finally, we design different attack and data distribution scenarios to validate our method. The experiments show that our method outperforms the baseline methods, remaining stable in all attack scenarios and achieving an accuracy improvement of 2% to 3% across different data distributions.
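The abstract outlines three algorithmic steps: logit-based malicious client identification, selective fusion of the global logits, and adaptive distillation from the server back to each client. The following is a minimal PyTorch sketch of how such a pipeline could look, not the authors' implementation: the shared-public-batch protocol, the cosine-to-median logit feature, the 0.5 trust threshold, and the agreement-based mixing weight alpha are all illustrative assumptions standing in for the paper's exact formulation.

    import torch
    import torch.nn.functional as F

    def identify_trusted_clients(client_logits: torch.Tensor, threshold: float = 0.5):
        """Flag clients whose logits deviate from the cross-client median.

        client_logits: (n_clients, n_samples, n_classes) logits computed on a
        shared public batch (an assumed protocol; the paper's exact logit-feature
        definition may differ).
        """
        feats = client_logits.flatten(start_dim=1)            # (n_clients, d)
        median = feats.median(dim=0).values                   # (d,)
        sims = F.cosine_similarity(feats, median.unsqueeze(0), dim=1)
        return sims >= threshold, sims                        # trusted mask, scores

    def fuse_global_logits(client_logits, trusted, sims):
        """Selective fusion: similarity-weighted average over trusted clients only."""
        w = sims * trusted.float()
        w = w / w.sum().clamp_min(1e-12)
        return torch.einsum("k,kbc->bc", w, client_logits)    # (n_samples, n_classes)

    def adaptive_kd_loss(student_logits, global_logits, labels, T=3.0):
        """Distill the fused global logits into a (heterogeneous) client model.

        alpha is a hypothetical adaptive weight: trust the teacher more when its
        predictions agree with the ground-truth labels on this batch.
        """
        with torch.no_grad():
            alpha = (global_logits.argmax(dim=1) == labels).float().mean()
        ce = F.cross_entropy(student_logits, labels)
        kd = F.kl_div(
            F.log_softmax(student_logits / T, dim=1),
            F.softmax(global_logits / T, dim=1),
            reduction="batchmean",
        ) * (T * T)
        return (1 - alpha) * ce + alpha * kd

    # Toy usage: 5 clients, one adversarial (uploads sign-flipped logits).
    clients = torch.randn(1, 8, 10).repeat(5, 1, 1) + 0.1 * torch.randn(5, 8, 10)
    clients[4] = -clients[4]                                  # malicious client
    trusted, sims = identify_trusted_clients(clients)
    global_logits = fuse_global_logits(clients, trusted, sims)
    labels = global_logits.argmax(dim=1)                      # placeholder labels
    loss = adaptive_kd_loss(torch.randn(8, 10), global_logits, labels)

In this sketch the flipped-logit client scores a negative cosine similarity to the median and is excluded from the weighted fusion, which is the qualitative behavior the abstract describes for its trust filter.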
Pages: 31
Related Papers
50 records in total
  • [41] Poster: AsyncFedKD: Asynchronous Federated Learning with Knowledge Distillation
    Mohammed, Malik Naik
    Zhang, Xinyue
    Valero, Maria
    Xie, Ying
    2023 IEEE/ACM CONFERENCE ON CONNECTED HEALTH: APPLICATIONS, SYSTEMS AND ENGINEERING TECHNOLOGIES, CHASE, 2023, : 207 - 208
  • [42] Federated Split Learning via Mutual Knowledge Distillation
    Luo, Linjun
    Zhang, Xinglin
    IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2024, 11 (03): 2729 - 2741
  • [43] FedX: Unsupervised Federated Learning with Cross Knowledge Distillation
    Han, Sungwon
    Park, Sungwon
    Wu, Fangzhao
    Kim, Sundong
    Wu, Chuhan
    Xie, Xing
    Cha, Meeyoung
    COMPUTER VISION - ECCV 2022, PT XXX, 2022, 13690 : 691 - 707
  • [44] Incentive and Knowledge Distillation Based Federated Learning for Cross-Silo Applications
    Li, Beibei
    Shi, Yaxin
    Guo, Yuqing
    Kong, Qinglei
    Jiang, Yukun
    IEEE INFOCOM 2022 - IEEE CONFERENCE ON COMPUTER COMMUNICATIONS WORKSHOPS (INFOCOM WKSHPS), 2022,
  • [45] Bearing Faulty Prediction Method Based on Federated Transfer Learning and Knowledge Distillation
    Zhou, Yiqing
    Wang, Jian
    Wang, Zeru
    MACHINES, 2022, 10 (05)
  • [46] Personalized federated learning via decoupling self-knowledge distillation and global adaptive aggregation
    Tang, Zhiwei
    Xu, Shuwei
    Jin, Haozhe
    Liu, Shichong
    Zhai, Rui
    Lu, Ke
    MULTIMEDIA SYSTEMS, 2025, 31 (02)
  • [47] Personalized Federated Learning Based on Bidirectional Knowledge Distillation for WiFi Gesture Recognition
    Geng, Huan
    Deng, Dongshang
    Zhang, Weidong
    Ji, Ping
    Wu, Xuangou
    ELECTRONICS, 2023, 12 (24)
  • [48] FedMEKT: Distillation-based embedding knowledge transfer for multimodal federated learning
    Le, Huy Q.
    Nguyen, Minh N. H.
    Thwal, Chu Myaet
    Qiao, Yu
    Zhang, Chaoning
    Hong, Choong Seon
    NEURAL NETWORKS, 2025, 183
  • [49] Ensemble Distillation Based Adaptive Quantization for Supporting Federated Learning in Wireless Networks
    Liu, Yi-Jing
    Feng, Gang
    Niyato, Dusit
    Qin, Shuang
    Zhou, Jianhong
    Li, Xiaoqian
    Xu, Xinyi
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2023, 22 (06) : 4013 - 4027
  • [50] Distributional Knowledge Transfer for Heterogeneous Federated Learning
    Wang, Luan
    Wang, Lijuan
    Shen, Jun
    2022 IEEE INTL CONF ON PARALLEL & DISTRIBUTED PROCESSING WITH APPLICATIONS, BIG DATA & CLOUD COMPUTING, SUSTAINABLE COMPUTING & COMMUNICATIONS, SOCIAL COMPUTING & NETWORKING, ISPA/BDCLOUD/SOCIALCOM/SUSTAINCOM, 2022, : 747 - 754