A Decentralized Federated Learning Based on Node Selection and Knowledge Distillation

Cited: 5
Authors
Zhou, Zhongchang [1]
Sun, Fenggang [1]
Chen, Xiangyu [1]
Zhang, Dongxu [2]
Han, Tianzhen [3]
Lan, Peng [1]
Affiliations
[1] Shandong Agricultural University, College of Information Science and Engineering, Tai'an 271018, People's Republic of China
[2] Taishan Intelligent Manufacturing Industry Research Institute, Tai'an 271000, People's Republic of China
[3] Taian China Mobile, Network Department Optimization Center, Tai'an 271000, People's Republic of China
Keywords
federated learning; node selection; decentralized learning; knowledge distillation
DOI
10.3390/math11143162
Chinese Library Classification
O1 [Mathematics]
Subject Classification Codes
0701; 070101
Abstract
Federated learning has become increasingly important in modern machine learning, especially for privacy-sensitive scenarios. Most existing federated learning adopts a network topology built around a central server, which makes the training process vulnerable to the central node. To address this problem, this article proposes a decentralized federated learning method based on node selection and knowledge distillation. Specifically, the central node in this method is not fixed: in each round it is selected through indicator exchange among the nodes. Meanwhile, a knowledge distillation mechanism is added to keep the student model as close as possible to the teacher network and preserve model accuracy. Experiments were conducted on the public MNIST, CIFAR-10, and FEMNIST datasets under both independent and identically distributed (IID) and non-IID settings. Numerical results show that the proposed method achieves higher accuracy than centralized federated learning, and greatly reduces computing time with little accuracy loss compared to blockchain-based decentralized federated learning. The proposed method therefore maintains model quality while meeting each node's individual model requirements and reducing running time.
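The abstract outlines two mechanisms: per-round selection of a temporary central node through exchanged indicators, and a distillation term that pulls each node's student model toward the teacher network. Below is a minimal sketch of both, assuming PyTorch; the indicator metric (local validation accuracy here) and the soft-target loss with temperature T and mixing weight alpha follow the standard Hinton et al. formulation, since the abstract does not specify the paper's exact choices.

```python
import torch
import torch.nn.functional as F

def select_aggregator(indicators):
    """Pick this round's temporary central node as the node with the best
    exchanged indicator. The paper does not name the metric; local
    validation accuracy is assumed here purely for illustration."""
    return max(indicators, key=indicators.get)

def kd_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Soft-target knowledge distillation loss (Hinton et al.): KL divergence
    between temperature-softened teacher and student distributions, mixed
    with the hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so the soft-loss gradient matches the hard loss
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Toy round: three nodes exchange indicators; the best one acts as the
# aggregator/teacher, and each node trains its student against kd_loss.
indicators = {"node_a": 0.91, "node_b": 0.88, "node_c": 0.93}
print(select_aggregator(indicators))  # -> node_c
loss = kd_loss(torch.randn(8, 10), torch.randn(8, 10), torch.randint(0, 10, (8,)))
```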
Pages: 15
Related Papers
50 records in total
  • [21] FedRCIL: Federated Knowledge Distillation for Representation based Contrastive Incremental Learning
    Psaltis, Athanasios
    Chatzikonstantinou, Christos
    Patrikakis, Charalampos Z.
    Daras, Petros
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION WORKSHOPS, ICCVW, 2023: 3455-3464
  • [22] Decentralized Federated Learning via Mutual Knowledge Transfer
    Li, Chengxi
    Li, Gang
    Varshney, Pramod K.
    IEEE INTERNET OF THINGS JOURNAL, 2022, 9 (02): 1136-1147
  • [23] Resource-Aware Knowledge Distillation for Federated Learning
    Chen, Zheyi
    Tian, Pu
    Liao, Weixian
    Chen, Xuhui
    Xu, Guobin
    Yu, Wei
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTING, 2023, 11 (03): 706-719
  • [24] Poster: AsyncFedKD: Asynchronous Federated Learning with Knowledge Distillation
    Mohammed, Malik Naik
    Zhang, Xinyue
    Valero, Maria
    Xie, Ying
    2023 IEEE/ACM CONFERENCE ON CONNECTED HEALTH: APPLICATIONS, SYSTEMS AND ENGINEERING TECHNOLOGIES, CHASE, 2023: 207-208
  • [25] GNN-based Neighbor Selection and Resource Allocation for Decentralized Federated Learning
    Meng, Chuiyang
    Tang, Ming
    Setayesh, Mehdi
    Wong, Vincent W. S.
    IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023: 1223-1228
  • [26] Federated Split Learning via Mutual Knowledge Distillation
    Luo, Linjun
    Zhang, Xinglin
    IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2024, 11 (03): 2729-2741
  • [27] FedX: Unsupervised Federated Learning with Cross Knowledge Distillation
    Han, Sungwon
    Park, Sungwon
    Wu, Fangzhao
    Kim, Sundong
    Wu, Chuhan
    Xie, Xing
    Cha, Meeyoung
    COMPUTER VISION - ECCV 2022, PT XXX, 2022, 13690: 691-707
  • [28] Heterogeneous Defect Prediction Based on Federated Transfer Learning via Knowledge Distillation
    Wang, Aili
    Zhang, Yutong
    Yan, Yixin
    IEEE ACCESS, 2021, 9: 29530-29540
  • [29] Incentive and Knowledge Distillation Based Federated Learning for Cross-Silo Applications
    Li, Beibei
    Shi, Yaxin
    Guo, Yuqing
    Kong, Qinglei
    Jiang, Yukun
    IEEE INFOCOM 2022 - IEEE CONFERENCE ON COMPUTER COMMUNICATIONS WORKSHOPS (INFOCOM WKSHPS), 2022
  • [30] Decentralized and Model-Free Federated Learning: Consensus-Based Distillation in Function Space
    Taya, Akihito
    Nishio, Takayuki
    Morikura, Masahiro
    Yamamoto, Koji
    IEEE TRANSACTIONS ON SIGNAL AND INFORMATION PROCESSING OVER NETWORKS, 2022, 8: 799-814