Adaptive Upgrade of Client Resources for Improving the Quality of Federated Learning Model

Cited by: 16
Authors
AbdulRahman, Sawsan [1 ]
Ould-Slimane, Hakima [2 ]
Chowdhury, Rasel [1 ]
Mourad, Azzam [3 ,4 ]
Talhi, Chamseddine [1 ]
Guizani, Mohsen [5 ]
Affiliations
[1] Ecole Technol Sup, Dept Software Engn & IT, Montreal, PQ H3C 1K3, Canada
[2] Univ Quebec Trois Rivieres, Dept Math & Comp Sci, Trois Rivieres, PQ G8Z 4M3, Canada
[3] Lebanese Amer Univ, Cyber Secur Syst & Appl AI Res Ctr, Dept CSM, Beirut, Lebanon
[4] New York Univ Abu Dhabi, Div Sci, Abu Dhabi, U Arab Emirates
[5] Mohamed Bin Zayed Univ Artificial Intelligence, Dept Machine Learning, Abu Dhabi, U Arab Emirates
Keywords
Data models; Servers; Internet of Things; Adaptation models; Performance evaluation; Computational modeling; Training; Client selection; federated learning (FL); Internet of Things (IoT); Kubernetes; model significance; resource allocation; COMMUNICATION
DOI
10.1109/JIOT.2022.3218755
Chinese Library Classification (CLC) code
TP [Automation Technology, Computer Technology]
Subject classification code
0812
Abstract
Conventional systems are usually constrained to store data in a centralized location. This restriction has either precluded sensitive data from being shared or put its privacy on the line. Alternatively, federated learning (FL) has emerged as a promising privacy-preserving paradigm in which Internet of Things (IoT) devices, known as clients, exchange model parameters instead of their private data. FL trains a global model by communicating the local models generated by selected clients over many communication rounds until high learning performance is reached. In these settings, FL performance depends heavily on selecting the best available clients, a process strongly tied to the quality of their models and their training data. Such selection-based schemes have not yet been explored, particularly for participating clients that hold high-quality data but have limited resources. To address these challenges, we propose in this article FedAUR, a novel approach for adaptively upgrading client resources in FL. We first introduce a method to measure how a locally generated model would affect and improve the global model if selected for aggregation, without revealing raw data. Next, based on the significance of each client's parameters and the resources of its device, we design a selection scheme that manages and distributes the resources available on the server among the appropriate subset of clients. The client selection and resource allocation problem is thus formulated as an optimization problem whose objective is, in each round, to discover and train on the maximum number of the highest-quality samples needed to reach the target performance. Moreover, we present a Kubernetes-based prototype that we implemented to evaluate the performance of the proposed approach.
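The client selection and resource allocation problem described in the abstract can be pictured as choosing, under a server-side resource budget, the subset of clients whose updates contribute the most high-quality training data. The Python sketch below is a minimal, hypothetical illustration of that general idea only; it is not the FedAUR method from the paper. The `Client` fields, the significance-per-cost scoring rule, and the resource budget are assumptions introduced here for clarity.

```python
# Illustrative sketch only (not the paper's FedAUR algorithm): greedy client
# selection under a server-side resource budget. "significance" is a stand-in
# for the model-quality measure described in the abstract; all fields and
# numbers below are hypothetical.
from dataclasses import dataclass
from typing import List


@dataclass
class Client:
    client_id: str
    num_samples: int       # size of the client's local dataset
    significance: float    # proxy for how much the local update improves the global model
    resource_cost: float   # server resources needed to upgrade/support this client


def select_clients(clients: List[Client], budget: float) -> List[Client]:
    """Greedily pick clients that maximize quality-weighted samples per unit of resource cost."""
    ranked = sorted(
        clients,
        key=lambda c: (c.significance * c.num_samples) / max(c.resource_cost, 1e-9),
        reverse=True,
    )
    chosen, spent = [], 0.0
    for c in ranked:
        if spent + c.resource_cost <= budget:
            chosen.append(c)
            spent += c.resource_cost
    return chosen


if __name__ == "__main__":
    pool = [
        Client("a", num_samples=1200, significance=0.9, resource_cost=4.0),
        Client("b", num_samples=300, significance=0.4, resource_cost=1.0),
        Client("c", num_samples=800, significance=0.7, resource_cost=2.5),
        Client("d", num_samples=2000, significance=0.2, resource_cost=5.0),
    ]
    for c in select_clients(pool, budget=6.0):
        print(c.client_id, c.num_samples, c.significance)
```

A greedy ratio rule of this kind is only a heuristic for the underlying knapsack-style optimization; the paper formulates the problem exactly and targets the maximum number of highest-quality samples per round.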
Pages: 4677-4687
Number of pages: 11