Enabling efficient and low-effort decentralized federated learning with the EdgeFL framework

Cited: 0
Authors
Zhang, Hongyi [1 ]
Bosch, Jan [1 ]
Olsson, Helena Holmstrom [2 ]
Affiliations
[1] Chalmers Univ Technol, Gothenburg, Sweden
[2] Malmo Univ, Malmo, Sweden
Funding
Swedish Research Council;
Keywords
Federated learning; Machine learning; Software engineering; Decentralized architecture; Information privacy; DATA PRIVACY;
DOI
10.1016/j.infsof.2024.107600
Chinese Library Classification (CLC)
TP [Automation and Computer Technology];
Subject classification code
0812;
Abstract
Context: Federated Learning (FL) has gained prominence as a solution for preserving data privacy in machine learning applications. However, existing FL frameworks pose challenges for software engineers due to implementation complexity, limited customization options, and scalability issues. These limitations hinder the practical deployment of FL, especially in dynamic and resource-constrained edge environments, and thus its widespread adoption. Objective: To address these challenges, we propose EdgeFL, an efficient and low-effort FL framework designed to overcome the limitations of centralized aggregation, implementation complexity, and poor scalability. EdgeFL applies a decentralized architecture that eliminates reliance on a central server by enabling direct model training and aggregation among edge nodes, which enhances fault tolerance and adaptability to diverse edge environments. Methods: We conducted experiments and a case study to demonstrate the effectiveness of EdgeFL. Our approach focuses on reducing weight-update latency and facilitating faster model evolution on edge devices. Results: Our findings indicate that EdgeFL outperforms existing FL frameworks in learning efficiency and performance. By enabling quicker model evolution on edge devices, EdgeFL improves overall efficiency and responsiveness to changing data patterns. Conclusion: EdgeFL offers a solution for software engineers and companies seeking the benefits of FL while overcoming the challenges and privacy concerns associated with traditional FL frameworks. Its decentralized approach and simplified implementation, combined with enhanced customization and fault tolerance, make it suitable for diverse applications and industries.
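The serverless aggregation scheme the abstract describes can be illustrated with a minimal sketch. This is not EdgeFL's actual API; all names (`local_update`, `aggregate`, `run_round`, the ring topology) are hypothetical stand-ins showing the general idea of decentralized FedAvg-style averaging among peer edge nodes, with no central server.

```python
import random

def local_update(weights, lr=0.1):
    # Stand-in for one round of local training on a node's private data:
    # here, just a small random nudge to each weight (a toy gradient step).
    return [w - lr * random.uniform(-1.0, 1.0) for w in weights]

def aggregate(own, peer_models):
    # Decentralized FedAvg-style step: average the node's own model with
    # whatever peer models are currently reachable. No central server.
    models = [own] + peer_models
    return [sum(ws) / len(models) for ws in zip(*models)]

def run_round(nodes, topology):
    # One gossip round: every node trains locally, then averages its model
    # with those of its neighbours in the overlay topology.
    trained = {nid: local_update(w) for nid, w in nodes.items()}
    return {
        nid: aggregate(trained[nid], [trained[p] for p in topology[nid]])
        for nid in nodes
    }

# Three edge nodes in a fully connected overlay, each with a 2-parameter model.
nodes = {0: [0.0, 0.0], 1: [1.0, 1.0], 2: [2.0, 2.0]}
mesh = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
for _ in range(5):
    nodes = run_round(nodes, mesh)
```

Because each node pulls from its peers and aggregates independently, a node that drops out only shrinks its neighbours' averaging sets rather than stalling a global round, which is the fault-tolerance property the abstract attributes to the decentralized design.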
Pages: 13