Federated Continuous Learning With Broad Network Architecture

Cited by: 38
Authors
Le, Junqing [1 ]
Lei, Xinyu [2 ]
Mu, Nankun [3 ,4 ]
Zhang, Hengrun [5 ,6 ]
Zeng, Kai [5 ,6 ]
Liao, Xiaofeng [3 ,4 ]
Affiliations
[1] Southwest Univ, Coll Elect & Informat Engn, Chongqing 400715, Peoples R China
[2] Michigan State Univ, Dept Comp Sci & Engn, E Lansing, MI 48824 USA
[3] Chongqing Univ, Key Lab Dependable Serv Comp Cyber Phys Soc, Minist Educ, Chongqing, Peoples R China
[4] Chongqing Univ, Coll Comp Sci, Chongqing, Peoples R China
[5] George Mason Univ, Dept Comp Sci, Fairfax, VA 22030 USA
[6] George Mason Univ, Dept Elect & Comp Engn, Fairfax, VA 22030 USA
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation; National Key Research and Development Program of China;
Keywords
Servers; Computational modeling; Training; Training data; Adaptation models; Data models; Real-time systems; Broad learning (BL); catastrophic forgetting; continuous learning; federated learning (FL);
DOI
10.1109/TCYB.2021.3090260
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Federated learning (FL) is a machine-learning setting in which multiple clients collaboratively train a model under the coordination of a central server. Each client's raw data are stored locally, and only the trained weights are uploaded to the server, which mitigates the privacy risks of centralized machine learning. However, most existing FL models focus on one-time learning and do not consider continuous learning. Continuous learning learns from streaming data as they arrive, so it can adapt to environmental changes and provide better real-time performance. In this article, we present a federated continuous learning scheme based on broad learning (FCL-BL) to support efficient and accurate federated continuous learning (FCL). In FCL-BL, we propose a weighted processing strategy to solve the catastrophic forgetting problem, enabling FCL-BL to handle continuous learning. We then develop a local-independent training solution to support fast and accurate training in FCL-BL. The proposed solution avoids the time-consuming synchronous approach while addressing the inaccurate-training issue rooted in the previous asynchronous approach. Moreover, we introduce a batch-asynchronous approach and the broad learning (BL) technique to guarantee the high efficiency of FCL-BL. Specifically, the batch-asynchronous approach reduces the number of client-server interaction rounds, and the BL technique supports incremental learning without retraining when learning newly produced data. Finally, theoretical analysis and experimental results illustrate that FCL-BL outperforms existing FL schemes in terms of efficiency and accuracy in FCL.
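The abstract's two core server-side ideas (sample-weighted aggregation of client updates, plus a weighted blend with the prior global model to mitigate catastrophic forgetting) can be illustrated with a minimal sketch. All names (`server_aggregate`, `weighted_merge`, `alpha`) and the FedAvg-style average are illustrative assumptions, not the paper's exact FCL-BL algorithm:

```python
import numpy as np

def server_aggregate(client_updates, sample_counts):
    """FedAvg-style aggregation: average client weight vectors,
    weighted by each client's number of local training samples."""
    total = sum(sample_counts)
    return sum(w * (n / total) for w, n in zip(client_updates, sample_counts))

def weighted_merge(old_weights, new_weights, alpha=0.5):
    """Blend the previous global model with the freshly aggregated one.

    Retaining a fraction `alpha` of the old weights is one simple
    weighted-processing strategy against catastrophic forgetting
    when the model learns from a continuous data stream."""
    return alpha * old_weights + (1 - alpha) * new_weights

# Toy round: two clients upload weights, then the server merges
# the aggregate with the prior global model.
prior = np.array([0.0, 0.0])
updates = [np.array([1.0, 2.0]), np.array([3.0, 4.0])]
counts = [10, 30]

aggregated = server_aggregate(updates, counts)        # -> [2.5, 3.5]
model = weighted_merge(prior, aggregated, alpha=0.5)  # -> [1.25, 1.75]
```

In this sketch, `alpha` trades plasticity against stability: a larger `alpha` preserves more of what was already learned, a smaller one adapts faster to new data.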
Pages: 3874-3888
Page count: 15
Related Papers
50 records
  • [31] A federated learning architecture for secure and private neuroimaging analysis
    Stripelis, Dimitris
    Gupta, Umang
    Saleem, Hamza
    Dhinagar, Nikhil
    Ghai, Tanmay
    Anastasiou, Chrysovalantis
    Sanchez, Rafael
    Steeg, Greg Ver
    Ravi, Srivatsan
    Naveed, Muhammad
    Thompson, Paul M.
    Ambite, Jose Luis
    PATTERNS, 2024, 5 (08):
  • [32] Collaborative Neural Architecture Search for Personalized Federated Learning
    Liu, Yi
    Guo, Song
    Zhang, Jie
    Hong, Zicong
    Zhan, Yufeng
    Zhou, Qihua
    IEEE TRANSACTIONS ON COMPUTERS, 2025, 74 (01) : 250 - 262
  • [33] Personalized Federated Learning with Multi-branch Architecture
    Mori, Junki
    Yoshiyama, Tomoyuki
    Furukawa, Ryo
    Teranishi, Isamu
    2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN, 2023,
  • [34] A Federated Learning Architecture for Blockchain DDoS Attacks Detection
    Xu, Chang
    Jin, Guoxie
    Lu, Rongxing
    Zhu, Liehuang
    Shen, Xiaodong
    Guan, Yunguo
    Sharif, Kashif
    IEEE TRANSACTIONS ON SERVICES COMPUTING, 2024, 17 (05) : 1911 - 1923
  • [35] Architecture-Based FedAvg for Vertical Federated Learning
    Casella, Bruno
    Fonio, Samuele
    16TH IEEE/ACM INTERNATIONAL CONFERENCE ON UTILITY AND CLOUD COMPUTING, UCC 2023, 2023,
  • [36] Monitoring Concept Drift in Continuous Federated Learning Platforms
    Dusing, Christoph
    Cimiano, Philipp
    ADVANCES IN INTELLIGENT DATA ANALYSIS XXII, PT II, IDA 2024, 2024, 14642 : 83 - 94
  • [37] Broad Federated Meta-Learning of Damaged Objects in Aerial Videos
    Li, Zekai
    Wang, Wenfeng
    CMES-COMPUTER MODELING IN ENGINEERING & SCIENCES, 2023, 137 (03): : 2881 - 2899
  • [38] Active federated transfer algorithm based on broad learning for fault diagnosis
    Liu, Guokai
    Shen, Weiming
    Gao, Liang
    Kusiak, Andrew
    MEASUREMENT, 2023, 208
  • [39] Network-Level Adversaries in Federated Learning
    Severi, Giorgio
    Jagielski, Matthew
    Yar, Gokberk
    Wang, Yuxuan
    Oprea, Alina
    Nita-Rotaru, Cristina
    2022 IEEE CONFERENCE ON COMMUNICATIONS AND NETWORK SECURITY (CNS), 2022, : 19 - 27
  • [40] Neural network quantization in federated learning at the edge
    Tonellotto, Nicola
    Gotta, Alberto
    Nardini, Franco Maria
    Gadler, Daniele
    Silvestri, Fabrizio
    INFORMATION SCIENCES, 2021, 575 : 417 - 436