A Decentralized Federated Learning Based on Node Selection and Knowledge Distillation

Cited by: 5
Authors
Zhou, Zhongchang [1 ]
Sun, Fenggang [1 ]
Chen, Xiangyu [1 ]
Zhang, Dongxu [2 ]
Han, Tianzhen [3 ]
Lan, Peng [1 ]
Affiliations
[1] Shandong Agr Univ, Coll Informat Sci & Engn, Tai An 271018, Peoples R China
[2] Taishan Intelligent Mfg Ind Res Inst, Tai An 271000, Peoples R China
[3] Taian Chinamobile, Network Dept Optimizat Ctr, Tai An 271000, Peoples R China
Keywords
federated learning; node selection; decentralized learning; knowledge distillation;
DOI
10.3390/math11143162
CLC number
O1 [Mathematics]
Subject classification codes
0701; 070101
Abstract
Federated learning has become increasingly important in modern machine learning, especially in data-privacy-sensitive scenarios. Existing federated learning mainly adopts a network topology built around a central server, so the training process is vulnerable to failures of the central node. To address this problem, this article proposes a decentralized federated learning method based on node selection and knowledge distillation. Specifically, the central node in this method is variable and is selected through indicator exchange between nodes. Meanwhile, a knowledge distillation mechanism is added to keep the student model as close as possible to the teacher network and to preserve model accuracy. Experiments were conducted on the public MNIST, CIFAR-10, and FEMNIST datasets under both the Independent and Identically Distributed (IID) setting and the non-IID setting. Numerical results show that the proposed method achieves improved accuracy compared with centralized federated learning, and greatly reduces computing time with little accuracy loss compared with blockchain-based decentralized federated learning. The proposed method therefore guarantees model quality while meeting the individual model requirements of each node and reducing running time.
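The record does not give the paper's exact distillation loss or node-selection indicators, but the knowledge-distillation mechanism mentioned in the abstract is conventionally formulated as a temperature-scaled soft-label term plus a hard-label cross-entropy term (Hinton et al., 2015). The NumPy sketch below illustrates that standard formulation only; the temperature and weighting values are assumptions, not the authors' settings.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; a higher temperature yields softer distributions.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Weighted sum of a soft-target KL term (teacher -> student) and
    the usual hard-label cross-entropy on the ground-truth labels."""
    p_t = softmax(teacher_logits, temperature)  # teacher soft targets
    p_s = softmax(student_logits, temperature)  # student soft predictions
    # KL(teacher || student), scaled by T^2 so gradients match the hard term
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    # Standard cross-entropy on the true labels (temperature = 1)
    p_hard = softmax(student_logits)
    ce = -np.log(p_hard[np.arange(len(labels)), labels] + 1e-12)
    return alpha * temperature**2 * kl.mean() + (1 - alpha) * ce.mean()
```

When the student's logits match the teacher's, the KL term vanishes and the loss reduces to the weighted cross-entropy alone; training the student on this combined objective pulls it toward the teacher's output distribution, which is the effect the abstract attributes to the distillation mechanism.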
Pages: 15
Related papers (50 total)
  • [41] Graph Federated Learning Based on the Decentralized Framework
    Liu, Peilin
    Tang, Yanni
    Zhang, Mingyue
    Chen, Wu
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2023, PT III, 2023, 14256 : 452 - 463
  • [42] Blockchain-Based Decentralized Federated Learning
    Dirir, Ahmed
    Salah, Khaled
    Svetinovic, Davor
    Jayaraman, Raja
    Yaqoob, Ibrar
    Kanhere, Salil S.
    2022 FOURTH INTERNATIONAL CONFERENCE ON BLOCKCHAIN COMPUTING AND APPLICATIONS (BCCA), 2022, : 99 - 106
  • [43] Efficient Federated Learning for AIoT Applications Using Knowledge Distillation
    Liu, Tian
    Xia, Jun
    Ling, Zhiwei
    Fu, Xin
    Yu, Shui
    Chen, Mingsong
    IEEE INTERNET OF THINGS JOURNAL, 2023, 10 (08) : 7229 - 7243
  • [44] Fedadkd: heterogeneous federated learning via adaptive knowledge distillation
    Song, Yalin
    Liu, Hang
    Zhao, Shuai
    Jin, Haozhe
    Yu, Junyang
    Liu, Yanhong
    Zhai, Rui
    Wang, Longge
    PATTERN ANALYSIS AND APPLICATIONS, 2024, 27 (04)
  • [45] Mitigation of Membership Inference Attack by Knowledge Distillation on Federated Learning
    Ueda, Rei
    Nakai, Tsunato
    Yoshida, Kota
    Fujino, Takeshi
    IEICE TRANSACTIONS ON FUNDAMENTALS OF ELECTRONICS COMMUNICATIONS AND COMPUTER SCIENCES, 2025, E108A (03) : 267 - 279
  • [46] Communication-efficient federated learning via knowledge distillation
    Wu, Chuhan
    Wu, Fangzhao
    Lyu, Lingjuan
    Huang, Yongfeng
    Xie, Xing
    NATURE COMMUNICATIONS, 2022, 13 (01)
  • [47] Preservation of the Global Knowledge by Not-True Distillation in Federated Learning
    Lee, Gihun
    Jeong, Minchan
    Shin, Yongjin
    Bae, Sangmin
    Yun, Se-Young
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [48] Federated Learning and Reputation-Based Node Selection Scheme for Internet of Vehicles
    Su, Zhaoyu
    Cheng, Ruimin
    Li, Chunhai
    Chen, Mingfeng
    Zhu, Jiangnan
    Long, Yan
    ELECTRONICS, 2025, 14 (02)
  • [49] Resource Allocation for Federated Knowledge Distillation Learning in Internet of Drones
    Yao, Jingjing
    Cal, Semih
    Sun, Xiang
    IEEE INTERNET OF THINGS JOURNAL, 2025, 12 (07): 8064 - 8074
  • [50] Data-Free Knowledge Distillation for Heterogeneous Federated Learning
    Zhu, Zhuangdi
    Hong, Junyuan
    Zhou, Jiayu
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139