Towards Efficient Learning on the Computing Continuum: Advancing Dynamic Adaptation of Federated Learning

Citations: 0
Authors
Valli, Mathis [1 ]
Costan, Alexandru [1 ]
Tedeschi, Cedric [1 ]
Cudennec, Loic [2 ]
Affiliations
[1] Univ Rennes, IRISA, CNRS, INRIA, Rennes, France
[2] DGA Maitrise Informat, Rennes, France
Keywords
federated learning; dynamic adaptation; computing continuum; machine learning; data privacy;
DOI
10.1145/3659995.3660042
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Number
081104; 0812; 0835; 1405;
Abstract
Federated Learning (FL) has emerged as a paradigm shift enabling heterogeneous clients and devices to collaborate on training a shared global model while preserving the privacy of their local data. However, a common yet impractical assumption in existing FL approaches is that the deployment environment is static, which is rarely true in heterogeneous and highly volatile environments like the Edge-Cloud Continuum, where FL is typically executed. While most current FL approaches process data in an online fashion, and are therefore adaptive by nature, they only support adaptation at the ML/DL level (e.g., through continual learning to tackle data and concept drift), putting aside the effects of system variance. Moreover, the study and validation of FL approaches strongly rely on simulations, which, although informative, tend to overlook the real-world complexities and dynamics of actual deployments, in particular with respect to changing network conditions, varying client resources, and security threats. In this paper, we take a first step to address these challenges. We investigate the shortcomings of traditional, static FL models and identify areas of adaptation to tackle real-life deployment challenges. We devise a set of design principles for FL systems that can smartly adjust their strategies for aggregation, communication, privacy, and security in response to changing system conditions. To illustrate the benefits envisioned by these strategies, we present the results of a set of initial experiments on a 25-node testbed. The experiments, which vary both the number of participating clients and the network conditions, show how existing FL systems are strongly affected by changes in their operational environment. Based on these insights, we propose a set of takeaways for the FL community, towards further research into FL systems that are not only accurate and scalable but also able to dynamically adapt to the unpredictability of real-world deployments.
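The aggregation strategy the abstract proposes to adapt dynamically is, in most FL systems, some variant of the standard FedAvg step: the server computes a weighted average of client model updates, weighted by local dataset size. The sketch below illustrates that textbook baseline only; it is not the authors' implementation, and all names are illustrative.

```python
def fedavg_aggregate(client_weights, client_sizes):
    """Standard FedAvg server-side aggregation (illustrative sketch).

    client_weights: list of model parameter vectors (lists of floats),
                    one per participating client.
    client_sizes:   number of local training samples per client,
                    used as aggregation weights.
    Returns the new global parameter vector.
    """
    total = sum(client_sizes)
    dim = len(client_weights[0])
    global_weights = [0.0] * dim
    for w, n in zip(client_weights, client_sizes):
        for i in range(dim):
            # Each client contributes proportionally to its data volume.
            global_weights[i] += (n / total) * w[i]
    return global_weights

# Example: two clients; the client with 3x the data pulls the
# global model toward its parameters.
g = fedavg_aggregate([[1.0, 0.0], [0.0, 1.0]], [3, 1])
# g == [0.75, 0.25]
```

A dynamically adaptive system in the sense of the abstract would, for instance, change these aggregation weights or the set of participating clients in response to observed network conditions or client resource availability, rather than keeping them fixed across rounds.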
Pages: 34-41 (8 pages)