Asynchronous Federated Learning for Sensor Data with Concept Drift

Cited by: 22
Authors
Chen, Yujing [1 ]
Chai, Zheng [1 ]
Cheng, Yue [1 ]
Rangwala, Huzefa [1 ]
Affiliation
[1] George Mason Univ, Dept Comp Sci, Fairfax, VA 22030 USA
Keywords
federated learning; asynchronous learning; concept drift; communication-efficient; classification
DOI
10.1109/BigData52589.2021.9671924
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Federated learning (FL) involves multiple distributed devices jointly training a shared model without any of the participants having to reveal their local data to a centralized server. Most previous FL approaches assume that data on devices are fixed and stationary during the training process. However, this assumption is unrealistic because these devices usually have varying sampling rates and different system configurations. In addition, the underlying distribution of the device data can change dynamically over time, which is known as concept drift. Concept drift makes the learning process complicated because of the inconsistency between existing and upcoming data. Traditional concept drift handling techniques, such as chunk-based and ensemble learning-based methods, are not suitable for federated learning frameworks due to the heterogeneity of local devices. We propose a novel approach, FedConD, to detect and deal with concept drift on local devices and minimize its effect on model performance in asynchronous FL. The drift detection strategy is based on an adaptive mechanism that uses the historical performance of the local models. The drift adaptation is realized by adjusting the regularization parameter of the objective function on each local device. Additionally, we design a communication strategy on the server side to select local updates in a prudent fashion and speed up model convergence. Experimental evaluations on three evolving data streams and two image datasets show that FedConD detects and handles concept drift while also reducing the overall communication cost compared to other baseline methods.
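
The abstract only outlines the local-side mechanism (drift detection from the historical performance of the local model, and adaptation by tuning the regularization parameter of the local objective). The Python sketch below is a minimal illustration of that idea under stated assumptions, not the paper's implementation: the loss-history window, the deviation-threshold rule, the increase/decay policy for the regularization weight mu, and the FedProx-style proximal term are all hypothetical choices introduced here for clarity.

# Hypothetical sketch of local drift detection and adaptation in an
# asynchronous FL client. Window size, threshold rule, and the proximal
# objective are assumptions, not the authors' exact formulation.
import numpy as np

class LocalDriftHandler:
    def __init__(self, mu=0.01, window=10, sensitivity=2.0):
        self.mu = mu                  # regularization weight on the proximal term
        self.window = window          # number of past rounds kept as history
        self.sensitivity = sensitivity
        self.loss_history = []        # historical local performance (losses)

    def detect_drift(self, current_loss):
        """Flag drift when the new loss deviates sharply from recent history."""
        if len(self.loss_history) < self.window:
            self.loss_history.append(current_loss)
            return False
        mean = np.mean(self.loss_history)
        std = np.std(self.loss_history) + 1e-8
        drifted = current_loss > mean + self.sensitivity * std
        self.loss_history = (self.loss_history + [current_loss])[-self.window:]
        return drifted

    def adapt(self, drifted):
        """Assumed policy: raise mu on suspected drift so the local update
        stays closer to the global model; slowly relax it otherwise."""
        self.mu = min(self.mu * 2.0, 1.0) if drifted else max(self.mu * 0.9, 1e-3)
        return self.mu

    def local_objective(self, task_loss, local_w, global_w):
        """Task loss plus a FedProx-style proximal penalty weighted by mu."""
        prox = 0.5 * self.mu * np.sum((local_w - global_w) ** 2)
        return task_loss + prox

In use, each client would call detect_drift with its latest local loss after every round, pass the result to adapt, and then minimize local_objective against the most recent global weights; the server-side selective-update strategy described in the abstract is not sketched here.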
Pages: 4822 - 4831
Number of pages: 10
Related Papers
(50 in total; items [31]-[40] shown)
  • [31] Dynamical Targeted Ensemble Learning for Streaming Data With Concept Drift
    Guo, Husheng
    Zhang, Yang
    Wang, Wenjian
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2024, 36 (12) : 8023 - 8036
  • [32] Learning from streaming data with concept drift and imbalance: an overview
    T. Ryan Hoens
    Robi Polikar
    Nitesh V. Chawla
    Progress in Artificial Intelligence, 2012, 1 (1) : 89 - 101
  • [33] Online Federated Learning via Non-Stationary Detection and Adaptation Amidst Concept Drift
    Ganguly, Bhargav
    Aggarwal, Vaneet
    IEEE-ACM TRANSACTIONS ON NETWORKING, 2024, 32 (01) : 643 - 653
  • [34] Asynchronous federated learning on heterogeneous devices: A survey
    Xu, Chenhao
    Qu, Youyang
    Xiang, Yong
    Gao, Longxiang
    COMPUTER SCIENCE REVIEW, 2023, 50
  • [35] Asynchronous Decentralized Federated Learning for Heterogeneous Devices
    Liao, Yunming
    Xu, Yang
    Xu, Hongli
    Chen, Min
    Wang, Lun
    Qiao, Chunming
    IEEE-ACM TRANSACTIONS ON NETWORKING, 2024, 32 (05) : 4535 - 4550
  • [36] Hierarchical Optimization for Asynchronous Vertical Federated Learning
    Li, Xinchao
    Zhang, Zhixian
    Yang, Shiyou
    Zhou, Xuhua
    HUMAN-CENTRIC COMPUTING AND INFORMATION SCIENCES, 2025, 15
  • [37] When Blockchain Meets Asynchronous Federated Learning
    Jing, Rui
    Chen, Wei
    Wu, Xiaoxin
    Wang, Zehua
    Tian, Zijian
    Zhang, Fan
    ADVANCED INTELLIGENT COMPUTING TECHNOLOGY AND APPLICATIONS, PT IX, ICIC 2024, 2024, 14870 : 199 - 207
  • [38] Efficient Asynchronous Federated Learning for AUV Swarm
    Meng, Zezhao
    Li, Zhi
    Hou, Xiangwang
    Du, Jun
    Chen, Jianrui
    Wei, Wei
    SENSORS, 2022, 22 (22)
  • [39] Efficient asynchronous federated learning with sparsification and quantization
    Jia, Juncheng
    Liu, Ji
    Zhou, Chendi
    Tian, Hao
    Dong, Mianxiong
    Dou, Dejing
CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2024, 36 (09)
  • [40] Wireless Federated Learning With Asynchronous and Quantized Updates
    Huang, Peishan
    Li, Dong
    Yan, Zhigang
    IEEE COMMUNICATIONS LETTERS, 2023, 27 (09) : 2393 - 2397