AdaptFL: Adaptive Federated Learning Framework for Heterogeneous Devices

Cited by: 0
Authors
Zhang, Yingqi [1 ]
Xia, Hui [1 ]
Xu, Shuo [1 ]
Wang, Xiangxiang [1 ]
Xu, Lijuan [2 ]
Affiliations
[1] Ocean Univ China, Coll Comp Sci & Technol, Qingdao 266100, Peoples R China
[2] Qilu Univ Technol, Shandong Acad Sci, Shandong Comp Sci Ctr, Natl Supercomp Ctr Jinan,Key Lab Comp Power Networ, Jinan 250014, Peoples R China
Source
FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE | 2025, Vol. 165
Keywords
Federated learning; Heterogeneous device; Neural architecture search; Knowledge distillation;
DOI
10.1016/j.future.2024.107610
Chinese Library Classification (CLC)
TP301 [Theory and Methods];
Discipline Classification Code
081202;
Abstract
With the development of the Internet of Things (IoT), Federated Learning (FL) is extensively employed in smart cities and the industrial IoT, involving numerous heterogeneous devices with varying computational and storage capabilities. Traditional FL assumes that clients have sufficient resources to train a unified global model from the beginning to the end of training; it therefore ignores the fact that client resources are unevenly distributed and change in real time. In addition, aggregating heterogeneous client models into a single global model is difficult. To address these challenges, we propose an Adaptive Federated Learning Framework for Heterogeneous Devices (AdaptFL). AdaptFL employs a resource-aware neural architecture search method that searches for a model based on each client's resource conditions, enabling the framework to automatically assign a customized model tailored to each client's resources in the current round. In addition, a staged knowledge distillation strategy enables efficient distribution and aggregation between the heterogeneous global model and the client models. Experimental results demonstrate that, compared with state-of-the-art model-level heterogeneous FL methods, AdaptFL improves global test accuracy by 4% to 15% on the SVHN dataset and by 5% to 14% in scenarios with heterogeneous data. AdaptFL also reduces communication overhead by more than 50% across all datasets. Furthermore, it offers a degree of resilience against model poisoning attacks.
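The abstract names two mechanisms: assigning each client a model sized to its current resource budget, and using knowledge distillation to move information between heterogeneous models. The following is a minimal sketch of those two ideas only, not the paper's actual algorithm: the function names, the candidate-model list, and the budget units are hypothetical, the architecture "search" is reduced to picking the largest candidate that fits the budget, and the distillation objective is the standard temperature-softened KL divergence rather than AdaptFL's staged strategy.

```python
import math

def select_client_model(resource_budget, candidate_models):
    """Stand-in for resource-aware model assignment: return the largest
    candidate whose cost fits the client's budget this round; fall back
    to the cheapest candidate if nothing fits.  (Hypothetical interface.)"""
    feasible = [m for m in candidate_models if m["cost"] <= resource_budget]
    if not feasible:
        return min(candidate_models, key=lambda m: m["cost"])
    return max(feasible, key=lambda m: m["cost"])

def softmax(logits, temperature=1.0):
    """Numerically stable temperature-scaled softmax."""
    scaled = [z / temperature for z in logits]
    mx = max(scaled)
    exps = [math.exp(z - mx) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Standard knowledge-distillation objective: KL(teacher || student)
    over temperature-softened class distributions, scaled by T^2."""
    p = softmax(teacher_logits, temperature)  # teacher's soft targets
    q = softmax(student_logits, temperature)  # student's soft predictions
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q)) * temperature ** 2
```

For example, with candidates `tiny` (cost 1), `small` (cost 4), and `large` (cost 16), a client reporting a budget of 5 would be assigned `small`; if its budget dropped below 1 the next round, it would fall back to `tiny`. The distillation loss is zero when student and teacher logits agree and positive otherwise, which is what lets a small client model be trained against (or aggregated into) a differently sized global model through outputs rather than weights.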
Pages: 12