HADFL: Heterogeneity-aware Decentralized Federated Learning Framework

Cited by: 16
Authors
Cao, Jing [1 ]
Lian, Zirui [1 ]
Liu, Weihong [1 ]
Zhu, Zongwei [1 ]
Ji, Cheng [2 ]
Affiliations
[1] Univ Sci & Technol China, Hefei, Anhui, Peoples R China
[2] Nanjing Univ Sci & Technol, Nanjing, Peoples R China
Funding
China Postdoctoral Science Foundation;
Keywords
Distributed Training; Machine Learning; Federated Learning; Heterogeneous Computing;
DOI
10.1109/DAC18074.2021.9586101
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812 ;
Abstract
Federated learning (FL) supports training models on geographically distributed devices. However, traditional FL systems adopt a centralized synchronous strategy, which puts high communication pressure on the central server and poses model-generalization challenges. Existing FL optimizations either fail to speed up training on heterogeneous devices or suffer from poor communication efficiency. In this paper, we propose HADFL, a framework that supports decentralized asynchronous training on heterogeneous devices. Each device trains the model locally on its own data, using a heterogeneity-aware number of local steps. In each aggregation cycle, devices are selected probabilistically to perform model synchronization and aggregation. Compared with a traditional FL system, HADFL relieves the central server's communication pressure, efficiently utilizes heterogeneous computing power, and achieves speedups of up to 3.15x over decentralized-FedAvg and 4.68x over the PyTorch distributed training scheme, with almost no loss of convergence accuracy.
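The mechanism the abstract describes (heterogeneity-aware local steps plus probabilistic device selection per aggregation cycle) can be sketched as a minimal toy in Python. Everything below is an illustrative assumption, not HADFL's actual implementation: the function names, the linear scaling of local steps by device speed, and plain coordinate-wise averaging are all placeholders.

```python
import random

# Toy sketch of the scheme in the abstract (all details assumed):
# - faster devices run proportionally more local steps per cycle;
# - each cycle, devices are selected by probability to aggregate;
# - aggregation is plain parameter averaging.

def local_steps(base_steps, device_speed, max_speed):
    """Heterogeneity-aware local steps: scale the step budget by
    relative device speed so slow devices do not stall the cycle."""
    return max(1, round(base_steps * device_speed / max_speed))

def select_for_aggregation(device_probs, rng):
    """Pick the devices that synchronize this cycle, each chosen
    independently with its own probability."""
    return [d for d, p in device_probs.items() if rng.random() < p]

def aggregate(models):
    """Average the selected devices' parameter vectors coordinate-wise."""
    return [sum(ws) / len(models) for ws in zip(*models)]

if __name__ == "__main__":
    rng = random.Random(0)
    speeds = {"dev0": 1.0, "dev1": 0.5, "dev2": 0.25}
    probs = {d: s / sum(speeds.values()) for d, s in speeds.items()}
    steps = {d: local_steps(8, s, max_speed=1.0) for d, s in speeds.items()}
    print(steps)  # {'dev0': 8, 'dev1': 4, 'dev2': 2}
    print(select_for_aggregation(probs, rng))
    print(aggregate([[0.0, 0.0], [2.0, 2.0]]))  # [1.0, 1.0]
```

In this toy, faster devices do more local work per wall-clock cycle instead of forcing a synchronous barrier, which is the intuition behind the reported speedups over synchronous schemes.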
Pages: 1-6
Page count: 6
Related Papers
50 records in total
  • [21] Heterogeneity-Aware Memory Efficient Federated Learning via Progressive Layer Freezing
    Wu, Yebo
    Li, Li
    Tian, Chunlin
    Chang, Tao
    Lin, Chi
    Wang, Cong
    Xu, Cheng-Zhong
    2024 IEEE/ACM 32ND INTERNATIONAL SYMPOSIUM ON QUALITY OF SERVICE, IWQOS, 2024,
  • [22] Heterogeneity-Aware Cooperative Federated Edge Learning With Adaptive Computation and Communication Compression
    Zhang, Zhenxiao
    Gao, Zhidong
    Guo, Yuanxiong
    Gong, Yanmin
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2025, 24 (03) : 2073 - 2084
  • [23] A Heterogeneity-Aware Semi-Decentralized Model for a Lightweight Intrusion Detection System for IoT Networks Based on Federated Learning and BiLSTM
    Alsaleh, Shuroog
    Menai, Mohamed El Bachir
    Al-Ahmadi, Saad
    SENSORS, 2025, 25 (04)
  • [24] SDPIPE: A Semi-Decentralized Framework for Heterogeneity-aware Pipeline-parallel Training
    Miao, Xupeng
    Shi, Yining
    Yang, Zhi
    Cui, Bin
    Jia, Zhihao
    PROCEEDINGS OF THE VLDB ENDOWMENT, 2023, 16 (09): : 2354 - 2363
  • [25] A two-phase half-async method for heterogeneity-aware federated learning
    Ma, Tianyi
    Mao, Bingcheng
    Chen, Ming
    NEUROCOMPUTING, 2022, 485 : 134 - 154
  • [26] FedDM: Data and Model Heterogeneity-Aware Federated Learning via Dynamic Weight Sharing
    Shen, Leming
    Zheng, Yuanqing
    2023 IEEE 43RD INTERNATIONAL CONFERENCE ON DISTRIBUTED COMPUTING SYSTEMS, ICDCS, 2023, : 975 - 976
  • [27] Heterogeneity-Aware Coordination for Federated Learning via Stitching Pre-trained blocks
    Zhan, Shichen
    Wu, Yebo
    Tian, Chunlin
    Zha, Yan
    Li, Li
    2024 IEEE/ACM 32ND INTERNATIONAL SYMPOSIUM ON QUALITY OF SERVICE, IWQOS, 2024,
  • [28] FedVisual: Heterogeneity-Aware Model Aggregation for Federated Learning in Visual-Based Vehicular Crowdsensing
    Zhang, Wenjun
    Liu, Xiaoli
    Zhang, Ruoyi
    Zhu, Chao
    Tarkoma, Sasu
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (22): : 36191 - 36202
  • [29] An Efficient Blockchain Assisted Reputation Aware Decentralized Federated Learning Framework
    Kasyap, Harsh
    Manna, Arpan
    Tripathy, Somanath
    IEEE TRANSACTIONS ON NETWORK AND SERVICE MANAGEMENT, 2023, 20 (03): : 2771 - 2782
  • [30] Prague: High-Performance Heterogeneity-Aware Asynchronous Decentralized Training
    Luo, Qinyi
    He, Jiaao
    Zhuo, Youwei
    Qian, Xuehai
    TWENTY-FIFTH INTERNATIONAL CONFERENCE ON ARCHITECTURAL SUPPORT FOR PROGRAMMING LANGUAGES AND OPERATING SYSTEMS (ASPLOS XXV), 2020, : 401 - 416