A two-phase half-async method for heterogeneity-aware federated learning

Cited by: 3
Authors
Ma, Tianyi [1 ,2 ]
Mao, Bingcheng [1 ,2 ]
Chen, Ming [1 ,2 ]
Affiliations
[1] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou, Zhejiang, Peoples R China
[2] Hithink RoyalFlush Informat Network Co Ltd, Hangzhou, Zhejiang, Peoples R China
Keywords
Federated learning; Federated optimization; Non-IID data
DOI
10.1016/j.neucom.2021.08.146
CLC classification
TP18 [Artificial intelligence theory]
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Federated learning (FL) is a distributed machine learning paradigm that allows training models on decentralized data over large-scale edge/mobile devices without collecting raw data. However, existing methods are still far from efficient and stable under extreme statistical and environmental heterogeneity. In this work, we propose FedHA (Federated Heterogeneity Awareness), a novel half-async algorithm which simultaneously incorporates the merits of asynchronous and synchronous methods. It separates the training into two phases by estimating the consistency of optimization directions of collected local models. It applies different strategies to facilitate fast and stable training in these phases, namely model selection, adaptive local epoch, and heterogeneity-weighted aggregation. We provide theoretical convergence and communication guarantees on both convex and non-convex problems without introducing extra assumptions. In the first phase (the consistent phase), the convergence rate of FedHA is O(1/e^T), which is faster than existing methods while reducing communication. In the second phase (the inconsistent phase), FedHA retains the best-known results in convergence (O(1/T)) and communication (O(1/ε)). We validate our proposed algorithm on different tasks with both IID (Independently and Identically Distributed) and non-IID data, and results show that our algorithm is efficient, stable, and flexible under the twofold heterogeneity using the proposed strategies. (c) 2021 Elsevier B.V. All rights reserved.
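The abstract's two-phase idea can be sketched in a few lines: estimate how consistent the clients' local update directions are, run plain weighted averaging while they agree, and switch to a heterogeneity-aware weighting once they diverge. This is a minimal illustrative sketch, not the authors' implementation — the consistency statistic (mean pairwise cosine similarity), the threshold, and the alignment-based re-weighting are all assumptions standing in for the paper's model selection and heterogeneity-weighted aggregation.

```python
import numpy as np

def direction_consistency(updates):
    """Mean pairwise cosine similarity of client update vectors.

    Assumed stand-in for the paper's consistency estimate; the exact
    statistic used by FedHA is not given in the abstract.
    """
    sims = []
    for i in range(len(updates)):
        for j in range(i + 1, len(updates)):
            a, b = updates[i], updates[j]
            sims.append(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
    return float(np.mean(sims))

def aggregate(global_model, updates, weights, threshold=0.5):
    """Two-phase style aggregation (hypothetical sketch).

    Consistent phase: plain weighted averaging of client updates.
    Inconsistent phase: clients whose updates align poorly with the
    mean direction are down-weighted, a crude proxy for
    heterogeneity-weighted aggregation.
    """
    weights = np.asarray(weights, dtype=float)
    if direction_consistency(updates) >= threshold:
        w = weights / weights.sum()          # consistent phase
    else:
        mean_dir = np.mean(updates, axis=0)  # inconsistent phase
        align = np.array([max(u @ mean_dir, 0.0) for u in updates])
        w = weights * (align + 1e-12)
        w = w / w.sum()
    return global_model + sum(wi * ui for wi, ui in zip(w, updates))
```

With two nearly parallel updates the consistency score is close to 1, so the sketch takes the consistent branch and reduces to FedAvg-style weighted averaging.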
Pages: 134-154
Number of pages: 21
Related papers
50 records in total
  • [31] Wang, Jiajun; Mao, Yingchi; He, Xiaoming; Zhou, Tong; Wu, Jun; Wu, Jie. Accelerating Federated Learning with Two-phase Gradient Adjustment. 2022 IEEE 28th International Conference on Parallel and Distributed Systems (ICPADS), 2022: 810-817
  • [32] Bouvier, Thomas. Heterogeneity-aware Deep Learning Workload Deployments on the Computing Continuum. 2021 IEEE International Parallel and Distributed Processing Symposium Workshops (IPDPSW), 2021: 1027
  • [33] Zhou, Qihua; Guo, Song; Qu, Zhihao; Li, Peng; Li, Li; Guo, Minyi; Wang, Kun. Petrel: Heterogeneity-Aware Distributed Deep Learning via Hybrid Synchronization. IEEE Transactions on Parallel and Distributed Systems, 2021, 32(05): 1030-1043
  • [34] Miao, Xupeng; Nie, Xiaonan; Shao, Yingxia; Yang, Zhi; Jiang, Jiawei; Ma, Lingxiao; Cui, Bin. Heterogeneity-Aware Distributed Machine Learning Training via Partial Reduce. SIGMOD '21: Proceedings of the 2021 International Conference on Management of Data, 2021: 2262-2270
  • [35] Wu, Qiong; Chen, Xu; Ouyang, Tao; Zhou, Zhi; Zhang, Xiaoxi; Yang, Shusen; Zhang, Junshan. HiFlash: Communication-Efficient Hierarchical Federated Learning with Adaptive Staleness Control and Heterogeneity-Aware Client-Edge Association. IEEE Transactions on Parallel and Distributed Systems, 2023, 34(05): 1560-1579
  • [36] Alsaleh, Shuroog; Menai, Mohamed El Bachir; Al-Ahmadi, Saad. A Heterogeneity-Aware Semi-Decentralized Model for a Lightweight Intrusion Detection System for IoT Networks Based on Federated Learning and BiLSTM. Sensors, 2025, 25(04)
  • [37] Zhu, Kefei; Yang, Xu; Zhang, Yanbo; Liang, Mengkun; Wu, Jun. A Heterogeneity-Aware Car-Following Model: Based on the XGBoost Method. Algorithms, 2024, 17(02)
  • [38] Yang, Zhao; Zhang, Shengbing; Li, Chuxi; Wang, Miao; Yang, Jiaying; Zhang, Meng. Joint heterogeneity-aware personalized federated search for energy efficient battery-powered edge computing. Future Generation Computer Systems, 2023, 146: 178-194
  • [39] Aladwani, Tahani; Parambath, Shameem Puthiya; Anagnostopoulos, Christos; Deligianni, Fani. The Price of Labelling: A Two-Phase Federated Self-learning Approach. Machine Learning and Knowledge Discovery in Databases: Research Track, Part IV, ECML PKDD 2024, 2024, 14944: 126-142
  • [40] Peng, Yanjin; Wang, Lei. Heterogeneity-aware transfer learning for high-dimensional linear regression models. Computational Statistics & Data Analysis, 2025, 206