A Weighted Average Consensus Approach for Decentralized Federated Learning

Cited by: 15
Authors
Giuseppi, Alessandro [1 ]
Manfredi, Sabato [2 ]
Pietrabissa, Antonio [1 ]
Affiliations
[1] Univ Roma La Sapienza, Dept Comp Control & Management Engn, I-00185 Rome, Italy
[2] Univ Naples Federico II, Dept Elect Engn & Informat Technol, I-80125 Naples, Italy
Keywords
Federated learning (FedL); deep learning; federated averaging (FedAvg); machine learning (ML); artificial intelligence; discrete-time consensus; distributed systems; CONVERGENCE ANALYSIS; NETWORKS;
DOI
10.1007/s11633-022-1338-z
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812 ;
Abstract
Federated learning (FedL) is a machine learning (ML) technique used to train deep neural networks (DeepNNs) in a distributed way without the need to share data among the federated training clients. FedL was proposed for edge computing and Internet of things (IoT) tasks in which a centralized server coordinates and governs the training process. To remove the design limitation imposed by this centralized entity, this work proposes two different solutions to decentralize existing FedL algorithms, enabling the application of FedL on networks with arbitrary communication topologies and thus extending its domain of application to more complex scenarios and new tasks. Of the two proposed algorithms, one, called FedLCon, builds on results from discrete-time weighted average consensus theory and is able to match the performance of the standard centralized FedL solutions, as shown by the reported validation tests.
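As a reading aid for the idea named in the abstract (this is not the paper's implementation; the function names, the step size, and the numerator/denominator "ratio consensus" construction are illustrative assumptions), a weighted average such as FedAvg's data-size-weighted model average can be approximated over an arbitrary communication topology by running two plain discrete-time consensus iterations in parallel and taking their ratio:

```python
def consensus_step(states, adjacency, eps):
    """One synchronous discrete-time consensus step:
    x_i <- x_i + eps * sum_j a_ij * (x_j - x_i)."""
    n = len(states)
    return [
        states[i] + eps * sum(adjacency[i][j] * (states[j] - states[i])
                              for j in range(n))
        for i in range(n)
    ]

def weighted_average_consensus(params, weights, adjacency, eps=0.2, steps=200):
    """Approximate sum_i w_i x_i / sum_i w_i without a central server:
    run plain consensus on the weighted numerators w_i * x_i and on the
    weights w_i separately, then take the elementwise ratio at each node."""
    num = [w * p for w, p in zip(weights, params)]
    den = [float(w) for w in weights]
    for _ in range(steps):
        num = consensus_step(num, adjacency, eps)
        den = consensus_step(den, adjacency, eps)
    return [n_ / d_ for n_, d_ in zip(num, den)]
```

For a connected undirected graph and a step size eps below 1/(max node degree), every node's estimate converges to the same weighted average, which is what lets a consensus scheme of this kind reproduce a server-side weighted aggregation step.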
Pages: 319-330
Page count: 12
Related Papers
50 records
  • [31] Decentralized and Model-Free Federated Learning: Consensus-Based Distillation in Function Space
    The College of Science and Engineering, Department of Integrated Information Technology, Aoyama Gakuin University, Sagamihara 252-5258, Japan; [affiliation unknown], Japan
    arXiv
  • [32] IPLS: A Framework for Decentralized Federated Learning
    Pappas, Christodoulos
    Chatzopoulos, Dimitris
    Lalis, Spyros
    Vavalis, Manolis
    2021 IFIP NETWORKING CONFERENCE AND WORKSHOPS (IFIP NETWORKING), 2021,
  • [33] STATISTICAL INFERENCE FOR DECENTRALIZED FEDERATED LEARNING
    Gu, Jia
    Chen, Song Xi
    ANNALS OF STATISTICS, 2024, 52 (06): 2931-2955
  • [34] Decentralized Federated Learning With Unreliable Communications
    Ye, Hao
    Liang, Le
    Li, Geoffrey Ye
    IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2022, 16 (03) : 487 - 500
  • [35] Decentralized coordination for resilient federated learning: A blockchain-based approach with smart contracts and decentralized storage
    Ferretti, Stefano
    Cassano, Lorenzo
    Cialone, Gabriele
    D'Abramo, Jacopo
    Imboccioli, Filippo
    COMPUTER COMMUNICATIONS, 2025, 236
  • [36] From Matrix-Weighted Consensus to Multipartite Average Consensus
    Kwon, Seong-Ho
    Bae, Yoo-Bin
    Liu, Ji
    Ahn, Hyo-Sung
    IEEE TRANSACTIONS ON CONTROL OF NETWORK SYSTEMS, 2020, 7 (04): 1609-1620
  • [37] CLC: A Consensus-based Label Correction Approach in Federated Learning
    Zeng, Bixiao
    Yang, Xiaodong
    Chen, Yiqiang
    Yu, Hanchao
    Zhang, Yingwei
    ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY, 2022, 13 (05)
  • [38] Federated Learning With Cooperating Devices: A Consensus Approach for Massive IoT Networks
    Savazzi, Stefano
    Nicoli, Monica
    Rampa, Vittorio
    IEEE INTERNET OF THINGS JOURNAL, 2020, 7 (05) : 4641 - 4654
  • [39] Confederated Learning: Federated Learning With Decentralized Edge Servers
    Wang, Bin
    Fang, Jun
    Li, Hongbin
    Yuan, Xiaojun
    Ling, Qing
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2023, 71 : 248 - 263
  • [40] Federated Machine Learning for Skin Lesion Diagnosis: An Asynchronous and Weighted Approach
    Yaqoob, Muhammad Mateen
    Alsulami, Musleh
    Khan, Muhammad Amir
    Alsadie, Deafallah
    Saudagar, Abdul Khader Jilani
    AlKhathami, Mohammed
    DIAGNOSTICS, 2023, 13 (11)