FedREM: Guided Federated Learning in the Presence of Dynamic Device Unpredictability

Cited by: 2
Authors
Lan, Linsi [1 ,2 ]
Wang, Junbo [1 ,2 ]
Li, Zhi [1 ,2 ]
Kant, Krishna [3 ]
Liu, Wanquan [1 ,2 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Intelligent Syst Engn, Shenzhen 510006, Peoples R China
[2] Guangdong Prov Key Lab Fire Sci & Intelligent Emer, Guangzhou 510006, Peoples R China
[3] Temple Univ, Philadelphia, PA 19122 USA
Funding
National Natural Science Foundation of China;
Keywords
Federated learning; Internet of Vehicles; data heterogeneity; NETWORKS;
DOI
10.1109/TPDS.2024.3396133
CLC Number
TP301 [Theory and Methods];
Discipline Code
081202;
Abstract
Federated learning (FL) is a promising distributed machine learning scheme in which multiple clients collaborate by sharing a common learning model while keeping their private data local. It can be applied to many applications, e.g., training an autonomous driving system using the perception data of multiple vehicles. However, some clients may join the training system dynamically, which affects the stability and accuracy of the learning system in IoT environments. Meanwhile, data heterogeneity in the FL system further exacerbates this problem because of imbalanced data distributions. To address these problems, we propose a novel FL framework named FedREM (Retain-Expansion and Matching), which guides client training through two mechanisms: 1) a Retain-Expansion mechanism that lets clients perform local training and automatically extract data characteristics during training, and 2) a Matching mechanism that ensures new clients quickly adapt to the global model by matching their data characteristics and adjusting the model accordingly. Results of extensive experiments verify that our FedREM outperforms various baselines in terms of model accuracy, communication efficiency, and system robustness.
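To make the Matching idea in the abstract concrete, below is a minimal, hypothetical Python sketch. It assumes, purely for illustration, that a client's "data characteristics" are summarized as a normalized label histogram and that a new client is matched to the most similar stored characteristic by cosine similarity; the actual FedREM mechanisms (Retain-Expansion and Matching) are defined in the full paper and may differ from this simplification.

# Hypothetical sketch of the "Matching" idea from the abstract: a new client
# summarizes its local data as a characteristic vector (here, a normalized
# label histogram -- an assumption, not the paper's definition) and is matched
# to the stored characteristic it resembles most, so it can start from a
# model adapted to similar data instead of a cold global model.
import numpy as np

def label_distribution(labels, num_classes):
    """Normalized label histogram used as a stand-in 'data characteristic'."""
    hist = np.bincount(labels, minlength=num_classes).astype(float)
    return hist / hist.sum()

def cosine(a, b):
    """Cosine similarity between two characteristic vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def match_new_client(new_characteristic, stored_characteristics):
    """Return the index of the stored characteristic closest to the new client's."""
    scores = [cosine(new_characteristic, c) for c in stored_characteristics]
    return int(np.argmax(scores))

# Toy usage: three groups of existing clients with different label skews.
rng = np.random.default_rng(0)
stored = [
    label_distribution(rng.integers(0, 3, size=200), num_classes=10),   # skewed to classes 0-2
    label_distribution(rng.integers(3, 7, size=200), num_classes=10),   # skewed to classes 3-6
    label_distribution(rng.integers(0, 10, size=200), num_classes=10),  # roughly uniform
]
new_client_labels = rng.integers(3, 7, size=50)
idx = match_new_client(label_distribution(new_client_labels, 10), stored)
print(f"new client matched to group {idx}")  # expected: group 1

In this toy run the dynamically joining client, whose data is skewed toward classes 3-6, is matched to the group with the same skew, which is the intuition behind letting new clients adapt quickly instead of starting from an unmatched global model.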
Pages: 1189 - 1206
Number of pages: 18