AdaptFL: Adaptive Federated Learning Framework for Heterogeneous Devices

Cited by: 0
Authors
Zhang, Yingqi [1 ]
Xia, Hui [1 ]
Xu, Shuo [1 ]
Wang, Xiangxiang [1 ]
Xu, Lijuan [2 ]
Affiliations
[1] Ocean Univ China, Coll Comp Sci & Technol, Qingdao 266100, Peoples R China
[2] Qilu Univ Technol, Shandong Acad Sci, Shandong Comp Sci Ctr, Natl Supercomp Ctr Jinan, Key Lab Comp Power Network, Jinan 250014, Peoples R China
Source
FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE | 2025, Vol. 165
Keywords
Federated learning; Heterogeneous device; Neural architecture search; Knowledge distillation
DOI
10.1016/j.future.2024.107610
Chinese Library Classification
TP301 [Theory and Methods]
Subject Classification Code
081202
Abstract
With the development of the Internet of Things (IoT), Federated Learning (FL) is widely deployed in smart cities and the industrial IoT, involving numerous heterogeneous devices with varying computational and storage capabilities. Traditional FL assumes that every client has sufficient resources to train a unified global model from the beginning to the end of training; it therefore ignores the fact that client resources are uneven and change in real time. In addition, heterogeneous client models are difficult to aggregate with the global model. To address these challenges, we propose an Adaptive Federated Learning Framework for Heterogeneous Devices (AdaptFL). AdaptFL employs a resource-aware neural architecture search method that searches for models according to each client's resource conditions, allowing it to automatically assign a customized model tailored to each client's resources in the current round. It further employs a staged knowledge distillation strategy to enable efficient distribution and aggregation between the heterogeneous global model and the client models. Experimental results demonstrate that, compared with state-of-the-art model-level heterogeneous ablation methods, AdaptFL improves global test accuracy by 4% to 15% on the SVHN dataset and by 5% to 14% in scenarios with heterogeneous data, while reducing communication overhead by more than 50% across all datasets. Furthermore, it offers a degree of resilience against model poisoning attacks.
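The abstract describes two mechanisms: a resource-aware architecture search that assigns each client a model matching its current resources, and staged knowledge distillation that bridges heterogeneous client models and the global model. The sketch below is a minimal illustration (not the published AdaptFL code) of how a server might aggregate heterogeneous client models into a global model via knowledge distillation on an unlabeled proxy set; the model widths, the synthetic proxy data, the temperature, and all function names are assumptions made for illustration.

```python
# Illustrative sketch only: distillation-based aggregation of heterogeneous
# client models into a global model. NOT the authors' AdaptFL implementation;
# model widths, proxy data, and hyperparameters are assumed for this example.
import torch
import torch.nn as nn
import torch.nn.functional as F


def make_client_model(width: int, num_classes: int = 10) -> nn.Module:
    """Clients may receive architectures of different capacity (here: hidden width)."""
    return nn.Sequential(nn.Flatten(),
                         nn.Linear(3 * 32 * 32, width),
                         nn.ReLU(),
                         nn.Linear(width, num_classes))


def distill_to_global(global_model: nn.Module,
                      client_models: list[nn.Module],
                      proxy_x: torch.Tensor,
                      temperature: float = 3.0,
                      epochs: int = 1,
                      lr: float = 1e-3) -> None:
    """Aggregate heterogeneous clients by distilling their averaged soft
    predictions on an unlabeled proxy set into the global model."""
    with torch.no_grad():
        # Ensemble teacher: average the clients' temperature-softened predictions.
        teacher_probs = torch.stack(
            [F.softmax(m(proxy_x) / temperature, dim=1) for m in client_models]
        ).mean(dim=0)

    opt = torch.optim.Adam(global_model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        student_log_probs = F.log_softmax(global_model(proxy_x) / temperature, dim=1)
        # KL divergence between the ensemble teacher and the global student.
        loss = F.kl_div(student_log_probs, teacher_probs, reduction="batchmean")
        loss.backward()
        opt.step()


if __name__ == "__main__":
    torch.manual_seed(0)
    proxy_x = torch.randn(256, 3, 32, 32)                      # synthetic stand-in for proxy data
    clients = [make_client_model(w) for w in (64, 128, 256)]   # heterogeneous client capacities
    global_model = make_client_model(512)                      # server-side global model
    distill_to_global(global_model, clients, proxy_x)
```

The point this sketch illustrates is that distillation exchanges predictions rather than parameters, which is what allows clients with different architectures to contribute to a single global model; AdaptFL's staged strategy and its resource-aware search are described only at the level of the abstract here.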
Pages: 12
Related Papers (50 in total)
  • [1] Zhang, Weidong; Deng, Dongshang; Wu, Xuangou; Zhao, Wei; Liu, Zhi; Zhang, Tao; Kang, Jiawen; Niyato, Dusit. An adaptive asynchronous federated learning framework for heterogeneous Internet of things. INFORMATION SCIENCES, 2025, 689.
  • [2] Yin, Tong; Li, Lixin; Lin, Wensheng; Ma, Donghui; Han, Zhu. Grouped Federated Learning: A Decentralized Learning Framework with Low Latency for Heterogeneous Devices. 2022 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS WORKSHOPS (ICC WORKSHOPS), 2022: 55-60.
  • [3] Li, Yixuan; Qin, Xiaoqi; Han, Kaifeng; Ma, Nan; Xu, Xiaodong; Zhang, Ping. Accelerating Wireless Federated Learning With Adaptive Scheduling Over Heterogeneous Devices. IEEE INTERNET OF THINGS JOURNAL, 2024, 11(2): 2286-2302.
  • [4] Sun, Jingwei; Li, Ang; Duan, Lin; Alam, Samiul; Deng, Xuliang; Guo, Xin; Wang, Haiming; Gorlatova, Maria; Zhang, Mi; Li, Hai; Chen, Yiran. FedSEA: A Semi-Asynchronous Federated Learning Framework for Extremely Heterogeneous Devices. PROCEEDINGS OF THE TWENTIETH ACM CONFERENCE ON EMBEDDED NETWORKED SENSOR SYSTEMS (SENSYS 2022), 2022: 106-119.
  • [5] Li, Li; Duan, Moming; Liu, Duo; Zhang, Yu; Ren, Ao; Chen, Xianzhang; Tan, Yujuan; Wang, Chengliang. FedSAE: A Novel Self-Adaptive Federated Learning Framework in Heterogeneous Systems. 2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021.
  • [6] Chen, Ximing; He, Xilong; Cheng, Du; Wu, Tiejun; Tian, Qingyu; Chen, Rongrong; Qiu, Jing. FedMEM: Adaptive Personalized Federated Learning Framework for Heterogeneous Mobile Edge Environments. INTERNATIONAL JOURNAL OF COMPUTATIONAL INTELLIGENCE SYSTEMS, 18(1).
  • [7] Xu, Chenhao; Qu, Youyang; Xiang, Yong; Gao, Longxiang. Asynchronous federated learning on heterogeneous devices: A survey. COMPUTER SCIENCE REVIEW, 2023, 50.
  • [8] Liao, Yunming; Xu, Yang; Xu, Hongli; Chen, Min; Wang, Lun; Qiao, Chunming. Asynchronous Decentralized Federated Learning for Heterogeneous Devices. IEEE-ACM TRANSACTIONS ON NETWORKING, 2024, 32(5): 4535-4550.
  • [9] Laguel, Yassine; Pillutla, Krishna; Malick, Jerome; Harchaoui, Zaid. A Superquantile Approach to Federated Learning with Heterogeneous Devices. 2021 55TH ANNUAL CONFERENCE ON INFORMATION SCIENCES AND SYSTEMS (CISS), 2021.
  • [10] Islam, Mohammad Munzurul; Alawad, Mohammed. Optimizing Federated Learning with Heterogeneous Edge Devices. 2024 IEEE 3RD INTERNATIONAL CONFERENCE ON COMPUTING AND MACHINE INTELLIGENCE (ICMI 2024), 2024.