A Decentralized Federated Learning Based on Node Selection and Knowledge Distillation

Cited by: 5
Authors
Zhou, Zhongchang [1 ]
Sun, Fenggang [1 ]
Chen, Xiangyu [1 ]
Zhang, Dongxu [2 ]
Han, Tianzhen [3 ]
Lan, Peng [1 ]
Affiliations
[1] Shandong Agr Univ, Coll Informat Sci & Engn, Tai An 271018, Peoples R China
[2] Taishan Intelligent Mfg Ind Res Inst, Tai An 271000, Peoples R China
[3] Taian Chinamobile, Network Dept Optimizat Ctr, Tai An 271000, Peoples R China
Keywords
federated learning; node selection; decentralized learning; knowledge distillation;
DOI
10.3390/math11143162
Chinese Library Classification
O1 [Mathematics];
Subject Classification Codes
0701; 070101;
Abstract
Federated learning has become increasingly important in modern machine learning, especially in privacy-sensitive scenarios. Most existing federated learning methods adopt a central server-based network topology, whose training process is susceptible to failures of the central node. To address this problem, this article proposes a decentralized federated learning method based on node selection and knowledge distillation. Specifically, the central node in this method is variable: it is selected through indicator interaction between nodes. Meanwhile, a knowledge distillation mechanism is added to keep the student model as close as possible to the teacher network and to ensure the model's accuracy. Experiments were conducted on the public MNIST, CIFAR-10, and FEMNIST datasets under both the independent and identically distributed (IID) setting and the non-IID setting. Numerical results show that the proposed method achieves improved accuracy compared to centralized federated learning, and its computing time is greatly reduced, with little accuracy loss, compared to blockchain-based decentralized federated learning. Therefore, the proposed method guarantees the model's effectiveness while meeting the individual model requirements of each node and reducing the running time.
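The abstract outlines two mechanisms: selecting a variable central node through indicator interaction between nodes, and distilling the aggregated teacher model into each node's student model. The record does not specify the exact indicators or the distillation objective, so the PyTorch sketch below is only illustrative: the select_central_node helper and its score dictionary are hypothetical stand-ins for the paper's indicator interaction, and the loss is the standard temperature-softened soft-label distillation blended with cross-entropy, not necessarily the paper's exact formulation.

import torch
import torch.nn.functional as F

def select_central_node(indicators):
    """Pick this round's aggregator from the indicators nodes exchange.

    `indicators` maps node id -> score. The score itself is an
    assumption: the abstract only says nodes interact via indicators,
    so any concrete weighting (data size, compute, bandwidth, ...)
    is hypothetical here.
    """
    return max(indicators, key=indicators.get)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Assumed soft-label distillation: KL divergence between
    temperature-softened teacher/student outputs, blended with the
    usual cross-entropy on the hard labels."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradients match the hard-label term
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Example round: node 2 has the best indicator and serves as aggregator;
# each node then distills the aggregated (teacher) model locally.
indicators = {0: 0.61, 1: 0.47, 2: 0.83}
aggregator = select_central_node(indicators)  # -> 2
student_logits = torch.randn(8, 10)  # batch of 8 samples, 10 classes
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)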
Pages: 15
Related Papers
50 records in total
  • [1] FedDKD: Federated learning with decentralized knowledge distillation
    Li, Xinjia
    Chen, Boyu
    Lu, Wenlian
    APPLIED INTELLIGENCE, 2023, 53 (15) : 18547 - 18563
  • [2] Personalized Decentralized Federated Learning with Knowledge Distillation
    Jeong, Eunjeong
    Kountouris, Marios
    ICC 2023-IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023, : 1982 - 1987
  • [3] Implications of Node Selection in Decentralized Federated Learning
    Lodhi, Ahnaf Hannan
    Akgun, Baris
    Ozkasap, Öznur
    2023 31ST SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS CONFERENCE, SIU, 2023,
  • [4] Decentralized Federated Learning via Mutual Knowledge Distillation
    Huang, Yue
    Kong, Lanju
    Li, Qingzhong
    Zhang, Baochen
    2023 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, ICME, 2023, : 342 - 347
  • [5] Federated Learning Algorithm Based on Knowledge Distillation
    Jiang, Donglin
    Shan, Chen
    Zhang, Zhihui
    2020 INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND COMPUTER ENGINEERING (ICAICE 2020), 2020, : 163 - 167
  • [6] FeDZIO: Decentralized Federated Knowledge Distillation on Edge Devices
    Palazzo, Luca
    Pennisi, Matteo
    Bellitto, Giovanni
    Kavasidis, Isaak
    IMAGE ANALYSIS AND PROCESSING - ICIAP 2023 WORKSHOPS, PT II, 2024, 14366 : 201 - 210
  • [7] A Personalized Federated Learning Method Based on Clustering and Knowledge Distillation
    Zhang, Jianfei
    Shi, Yongqiang
    ELECTRONICS, 2024, 13 (05)
  • [8] A Federated Learning Framework Based on Transfer Learning and Knowledge Distillation for Targeted Advertising
    Su, Caiyu
    Wei, Jinri
    Lei, Yuan
    Li, Jiahui
    PEERJ COMPUTER SCIENCE, 2023, 9
  • [9] A Personalized Federated Learning Algorithm Based on Meta-Learning and Knowledge Distillation
    Sun, Y.
    Shi, Y.
    Wang, Z.
    Li, M.
    Si, P.
    Beijing Youdian Daxue Xuebao/Journal of Beijing University of Posts and Telecommunications, 2023, 46 (01): : 12 - 18