Federated Continuous Learning With Broad Network Architecture

Cited by: 38
Authors
Le, Junqing [1 ]
Lei, Xinyu [2 ]
Mu, Nankun [3 ,4 ]
Zhang, Hengrun [5 ,6 ]
Zeng, Kai [5 ,6 ]
Liao, Xiaofeng [3 ,4 ]
Affiliations
[1] Southwest Univ, Coll Elect & Informat Engn, Chongqing 400715, Peoples R China
[2] Michigan State Univ, Dept Comp Sci & Engn, E Lansing, MI 48824 USA
[3] Chongqing Univ, Key Lab Dependable Serv Comp Cyber Phys Soc, Minist Educ, Chongqing, Peoples R China
[4] Chongqing Univ, Coll Comp Sci, Chongqing, Peoples R China
[5] George Mason Univ, Dept Comp Sci, Fairfax, VA 22030 USA
[6] George Mason Univ, Dept Elect & Comp Engn, Fairfax, VA 22030 USA
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation; National Key Research and Development Program of China;
Keywords
Servers; Computational modeling; Training; Training data; Adaptation models; Data models; Real-time systems; Broad learning (BL); catastrophic forgetting; continuous learning; federated learning (FL);
DOI
10.1109/TCYB.2021.3090260
Chinese Library Classification (CLC)
TP [Automation technology; computer technology];
Discipline code
0812;
Abstract
Federated learning (FL) is a machine-learning setting in which multiple clients collaboratively train a model under the coordination of a central server. Each client's raw data remain stored locally, and only the trained weights are uploaded to the server, which mitigates the privacy risks of centralized machine learning. However, most existing FL models focus on one-time learning and do not consider continuous learning. Continuous learning supports learning from streaming data, so it can adapt to environmental changes and provide better real-time performance. In this article, we present a federated continuous learning scheme based on broad learning (FCL-BL) to support efficient and accurate federated continuous learning (FCL). In FCL-BL, we propose a weighted processing strategy to solve the catastrophic forgetting problem, so FCL-BL can handle continuous learning. We then develop a local-independent training solution to support fast and accurate training in FCL-BL. The proposed solution avoids the time-consuming synchronous approach while addressing the inaccurate-training issue rooted in the previous asynchronous approach. Moreover, we introduce a batch-asynchronous approach and the broad learning (BL) technique to guarantee the high efficiency of FCL-BL. Specifically, the batch-asynchronous approach reduces the number of client-server interaction rounds, and the BL technique supports incremental learning without retraining when learning newly produced data. Finally, theoretical analysis and experimental results illustrate that FCL-BL is superior to existing FL schemes in terms of efficiency and accuracy in FCL.
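The batch-asynchronous idea in the abstract — clients train on several streaming batches locally and upload once, after which the server forms a weighted combination of client models — can be illustrated with a minimal sketch. This is not the paper's FCL-BL algorithm: the linear-regression client, the sample-count weighting rule, and all function names below are illustrative assumptions.

```python
import numpy as np

def local_update(w, X, y, lr=0.1):
    """One client SGD step of linear regression on a freshly arrived batch."""
    grad = X.T @ (X @ w - y) / len(y)
    return w - lr * grad

def weighted_aggregate(client_weights, sample_counts):
    """Server: combine client models, weighting each by its sample count
    (a common FL aggregation rule, assumed here for illustration)."""
    total = sum(sample_counts)
    return sum(n / total * w for w, n in zip(client_weights, sample_counts))

rng = np.random.default_rng(0)
d = 3
w_true = np.array([1.0, -2.0, 0.5])   # ground-truth model the stream follows
w_global = np.zeros(d)

for _ in range(50):                    # server rounds
    uploads, counts = [], []
    for _ in range(4):                 # 4 clients
        w = w_global.copy()
        n_seen = 0
        # Batch-asynchronous flavour: process 5 streaming batches locally
        # before a single upload, reducing client-server interaction rounds.
        for _ in range(5):
            X = rng.normal(size=(8, d))
            y = X @ w_true
            w = local_update(w, X, y)
            n_seen += len(y)
        uploads.append(w)
        counts.append(n_seen)
    w_global = weighted_aggregate(uploads, counts)

print(np.round(w_global, 3))
```

Because every client keeps learning from new batches between uploads, the global model tracks the data stream without any client ever sharing raw data, which is the privacy property the abstract emphasises.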
Pages: 3874-3888
Page count: 15