Federated Continuous Learning With Broad Network Architecture

Cited by: 38
Authors
Le, Junqing [1 ]
Lei, Xinyu [2 ]
Mu, Nankun [3 ,4 ]
Zhang, Hengrun [5 ,6 ]
Zeng, Kai [5 ,6 ]
Liao, Xiaofeng [3 ,4 ]
Affiliations
[1] Southwest Univ, Coll Elect & Informat Engn, Chongqing 400715, Peoples R China
[2] Michigan State Univ, Dept Comp Sci & Engn, E Lansing, MI 48824 USA
[3] Chongqing Univ, Key Lab Dependable Serv Comp Cyber Phys Soc, Minist Educ, Chongqing, Peoples R China
[4] Chongqing Univ, Coll Comp Sci, Chongqing, Peoples R China
[5] George Mason Univ, Dept Comp Sci, Fairfax, VA 22030 USA
[6] George Mason Univ, Dept Elect & Comp Engn, Fairfax, VA 22030 USA
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation; National Key Research and Development Program of China;
Keywords
Servers; Computational modeling; Training; Training data; Adaptation models; Data models; Real-time systems; Broad learning (BL); catastrophic forgetting; continuous learning; federated learning (FL);
DOI
10.1109/TCYB.2021.3090260
CLC Classification Number
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812 ;
Abstract
Federated learning (FL) is a machine-learning setting in which multiple clients collaboratively train a model under the coordination of a central server. The clients' raw data are stored locally, and each client uploads only the trained weights to the server, which mitigates the privacy risks of centralized machine learning. However, most existing FL models focus on one-time learning and do not consider continuous learning. Continuous learning supports learning from streaming data, so it can adapt to environmental changes and provide better real-time performance. In this article, we present a federated continuous learning scheme based on broad learning (FCL-BL) to support efficient and accurate federated continuous learning (FCL). In FCL-BL, we propose a weighted processing strategy to solve the catastrophic forgetting problem, so FCL-BL can handle continuous learning. We then develop a local-independent training solution to support fast and accurate training in FCL-BL. The proposed solution avoids a time-consuming synchronous approach while addressing the inaccurate-training issue rooted in previous asynchronous approaches. Moreover, we introduce a batch-asynchronous approach and the broad learning (BL) technique to guarantee the high efficiency of FCL-BL. Specifically, the batch-asynchronous approach reduces the number of client-server interaction rounds, and the BL technique supports incremental learning without retraining when learning newly produced data. Finally, theoretical analysis and experimental results illustrate that FCL-BL is superior to existing FL schemes in terms of efficiency and accuracy in FCL.
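To make the abstract's description more concrete, the minimal NumPy sketch below illustrates the general pattern of federated continuous learning with a broad-learning-style local model: each client expands streaming batches through fixed random feature and enhancement nodes, accumulates sufficient statistics so new batches are absorbed without retraining on old data, and the server combines the clients' output-layer weights. This is a sketch under assumed details, not the authors' FCL-BL algorithm; the names (ClientBL, learn_batch, aggregate), the lam regularizer, and the sample-count weighting are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

class ClientBL:
    """Broad-learning-style client: fixed random hidden nodes + a linear output layer."""

    def __init__(self, in_dim, out_dim, n_feat=64, n_enh=64, lam=1e-2):
        self.Wf = rng.standard_normal((in_dim, n_feat))   # feature-node weights (fixed)
        self.We = rng.standard_normal((n_feat, n_enh))    # enhancement-node weights (fixed)
        self.lam = lam                                    # ridge regularizer (assumed)
        d = n_feat + n_enh
        self.AtA = np.zeros((d, d))                       # running A^T A over all batches seen
        self.AtY = np.zeros((d, out_dim))                 # running A^T Y over all batches seen
        self.n_seen = 0

    def _expand(self, X):
        Z = np.tanh(X @ self.Wf)        # feature nodes
        H = np.tanh(Z @ self.We)        # enhancement nodes
        return np.hstack([Z, H])        # broad expansion A = [Z | H]

    def learn_batch(self, X, Y):
        """Absorb a new streaming batch; old raw data are never revisited."""
        A = self._expand(X)
        self.AtA += A.T @ A
        self.AtY += A.T @ Y
        self.n_seen += len(X)

    def output_weights(self):
        """Ridge solution W = (A^T A + lam*I)^-1 A^T Y over everything seen so far."""
        d = self.AtA.shape[0]
        return np.linalg.solve(self.AtA + self.lam * np.eye(d), self.AtY)

def aggregate(clients):
    """Server step: combine client output layers, weighted by samples seen (an assumption)."""
    total = sum(c.n_seen for c in clients)
    return sum((c.n_seen / total) * c.output_weights() for c in clients)

# Two clients receive streaming batches over several rounds; the server
# re-aggregates after each round, so learning continues as new data arrive.
clients = [ClientBL(in_dim=10, out_dim=3) for _ in range(2)]
for _ in range(3):
    for c in clients:
        X = rng.standard_normal((32, 10))
        Y = np.eye(3)[rng.integers(0, 3, 32)]   # one-hot labels for the synthetic batch
        c.learn_batch(X, Y)
    W_global = aggregate(clients)

print("global output-weight matrix shape:", W_global.shape)   # (128, 3)
```

Accumulating A^T A and A^T Y is what allows incremental learning in this sketch; the paper's weighted processing strategy and batch-asynchronous interaction go beyond the plain sample-count averaging shown above.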
Pages: 3874 - 3888
Number of pages: 15
Related Papers
50 records in total
  • [21] Federated Learning for Network Traffic Prediction
    Behera, Sadananda
    Panda, Saroj Kumar
    Panayiotou, Tania
    Ellinas, Georgios
    2024 23RD IFIP NETWORKING CONFERENCE, IFIP NETWORKING 2024, 2024, : 781 - 785
  • [22] Network Update Compression for Federated Learning
    Kathariya, Birendra
    Li, Li
    Li, Zhu
    Duan, Lingyu
    Liu, Shan
    2020 IEEE INTERNATIONAL CONFERENCE ON VISUAL COMMUNICATIONS AND IMAGE PROCESSING (VCIP), 2020, : 38 - 41
  • [23] Federated Continuous Learning Based on Stacked Broad Learning System Assisted by Digital Twin Networks: An Incremental Learning Approach for Intrusion Detection in UAV Networks
    He, Xiaoqiang
    Chen, Qianbin
    Tang, Lun
    Wang, Weili
    Liu, Tong
    Li, Li
    Liu, Qinghai
    Luo, Jia
    IEEE INTERNET OF THINGS JOURNAL, 2023, 10 (22) : 19825 - 19838
  • [24] A network architecture for continuous mobility
    Uehara, K
    Tateoka, T
    Watanabe, Y
    Sunahara, H
    Nakamura, O
    Murai, J
    WORLDWIDE COMPUTING AND ITS APPLICATIONS - WWCA'98, 1998, 1368 : 254 - 269
  • [25] A federated peer-to-peer network game architecture
    Rooney, S
    Bauer, DB
    Deydier, R
    IEEE COMMUNICATIONS MAGAZINE, 2004, 42 (05) : 114 - 122
  • [26] The Western Identification Network: Identification as a service in a federated architecture
    Konecny, Roger
    NEC Technical Journal, 2019, 13 (02) : 28 - 32
  • [27] Network-aware federated neural architecture search
    Ocal, Goktug
    Ozgovde, Atay
    FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE, 2025, 162
  • [28] Federated Learning for Healthcare: Systematic Review and Architecture Proposal
    Antunes, Rodolfo Stoffel
    da Costa, Cristiano Andre
    Kuederle, Arne
    Yari, Imrana Abdullahi
    Eskofier, Bjoern
    ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY, 2022, 13 (04)
  • [29] An Architecture for Resilient Federated Learning Through Parallel Recognition
    Kim, Jeongeun
    Jeong, Youngwoo
    Jang, Suyeon
    Lee, Seung Eun
    PROCEEDINGS OF THE 2022 31ST INTERNATIONAL CONFERENCE ON PARALLEL ARCHITECTURES AND COMPILATION TECHNIQUES, PACT 2022, 2022, : 546 - 547
  • [30] Crowdsourced Federated Learning Architecture with Personalized Privacy Preservation
    Xu, Yunfan
    Qiu, Xuesong
    Zhang, Fan
    Hao, Jiakai
    Intelligent and Converged Networks, 2024, 5 (03) : 192 - 206