Adaptive Upgrade of Client Resources for Improving the Quality of Federated Learning Model

Cited by: 16
Authors
AbdulRahman, Sawsan [1 ]
Ould-Slimane, Hakima [2 ]
Chowdhury, Rasel [1 ]
Mourad, Azzam [3 ,4 ]
Talhi, Chamseddine [1 ]
Guizani, Mohsen [5 ]
Affiliations
[1] Ecole Technol Sup, Dept Software Engn & IT, Montreal, PQ H3C 1K3, Canada
[2] Univ Quebec Trois Rivieres, Dept Math & Comp Sci, Trois Rivieres, PQ G8Z 4M3, Canada
[3] Lebanese Amer Univ, Cyber Secur Syst & Appl AI Res Ctr, Dept CSM, Beirut, Lebanon
[4] New York Univ Abu Dhabi, Div Sci, Abu Dhabi, U Arab Emirates
[5] Mohamed Bin Zayed Univ Artificial Intelligence, Dept Machine Learning, Abu Dhabi, U Arab Emirates
Keywords
Data models; Servers; Internet of Things; Adaptation models; Performance evaluation; Computational modeling; Training; Client selection; federated learning (FL); Internet of Things (IoT); Kubernetes; model significance; resource allocation; COMMUNICATION
DOI
10.1109/JIOT.2022.3218755
CLC Classification Number
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Conventional systems are usually constrained to storing data in a centralized location. This restriction has either precluded sensitive data from being shared or put its privacy at risk. Federated learning (FL) has emerged as a promising privacy-preserving alternative in which Internet of Things (IoT) devices, known as clients, exchange model parameters instead of private data. FL trains a global model by communicating the local models generated by selected clients over many communication rounds until high learning performance is reached. In these settings, FL performance depends heavily on selecting the best available clients, a process strongly tied to the quality of their models and their training data. Such selection schemes remain largely unexplored, particularly for participating clients that hold high-quality data but have limited resources. To address these challenges, we propose FedAUR, a novel approach for the adaptive upgrade of client resources in FL. We first introduce a method to measure how a locally generated model, if selected for aggregation, affects and improves the global model without revealing raw data. Next, based on the significance of each client's parameters and the resources of its device, we design a selection scheme that manages and distributes the resources available on the server among the appropriate subset of clients. The resulting client selection and resource allocation problem is formulated as an optimization problem whose objective is to discover and train, in each round, the maximum number of samples of the highest quality so as to reach the desired performance. Moreover, we present a Kubernetes-based prototype implemented to evaluate the performance of the proposed approach.
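To make the abstract's selection criterion concrete, the minimal Python sketch below shows one way a significance- and resource-aware client selection could look: clients are ranked by an assumed per-client significance score weighted by their sample count and normalized by the server resources their upgrade would consume, then picked greedily until a server-side budget is exhausted. The Client fields, the scoring rule, and select_clients are illustrative assumptions for this sketch, not the authors' FedAUR optimization.

# Hypothetical sketch, not the authors' FedAUR algorithm: greedy, significance- and
# resource-aware client selection under a fixed server-side resource budget.
from dataclasses import dataclass
from typing import List

@dataclass
class Client:
    cid: str
    significance: float  # assumed proxy for how much this local model would improve the global model
    num_samples: int     # training samples the client can contribute this round
    resource_need: float # server resources required to upgrade/run this client

def select_clients(clients: List[Client], budget: float) -> List[Client]:
    """Greedily pick clients with the best significance-weighted sample count
    per unit of requested server resources until the budget is exhausted."""
    ranked = sorted(
        clients,
        key=lambda c: (c.significance * c.num_samples) / max(c.resource_need, 1e-9),
        reverse=True,
    )
    chosen, used = [], 0.0
    for c in ranked:
        if used + c.resource_need <= budget:
            chosen.append(c)
            used += c.resource_need
    return chosen

if __name__ == "__main__":
    pool = [
        Client("a", significance=0.9, num_samples=1200, resource_need=2.0),
        Client("b", significance=0.4, num_samples=5000, resource_need=3.5),
        Client("c", significance=0.7, num_samples=800, resource_need=1.0),
    ]
    print([c.cid for c in select_clients(pool, budget=3.0)])  # e.g. ['c', 'a']

A real system would replace the static significance field with a measured contribution of the local update to the global model, which is what the paper's significance metric is intended to capture.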
Pages: 4677 - 4687
Number of Pages: 11
Related Papers
50 records in total
  • [41] Adaptive Client-Dropping in Federated Learning: Preserving Data Integrity in Medical Domains
    Negrao, Arthur
    Silva, Guilherme
    Pedrosa, Rodrigo
    Luz, Eduardo
    Silva, Pedro
    INTELLIGENT SYSTEMS, BRACIS 2024, PT I, 2025, 15412 : 111 - 126
  • [42] Efficient Federated Learning with Adaptive Client-Side Hyper-Parameter Optimization
    Kundroo, Majid
    Kim, Taehong
    2023 IEEE 43RD INTERNATIONAL CONFERENCE ON DISTRIBUTED COMPUTING SYSTEMS, ICDCS, 2023, : 973 - 974
  • [43] On Adaptive Client/Miner Selection for Efficient Blockchain-Based Decentralized Federated Learning
    Tomimasu, Yuta
    Sato, Koya
    2023 IEEE 98TH VEHICULAR TECHNOLOGY CONFERENCE, VTC2023-FALL, 2023,
  • [44] On Improving Quality of the Decision Making Process in a Federated Learning System
    Encheva, Sylvia
    Tumin, Sharil
    COOPERATIVE DESIGN, VISUALIZATION, AND ENGINEERING, PROCEEDINGS, 2008, 5220 : 192 - +
  • [45] Adaptive Model Pruning for Hierarchical Wireless Federated Learning
    Liu, Xiaonan
    Wang, Shiqiang
    Deng, Yansha
    Nallanathan, Arumugam
    2024 IEEE WIRELESS COMMUNICATIONS AND NETWORKING CONFERENCE, WCNC 2024, 2024,
  • [46] Efficient Wireless Federated Learning with Adaptive Model Pruning
    Chen, Zhixiong
    Yi, Wenqiang
    Lambotharan, Sangarapillai
    Nallanathan, Arumugam
    IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023, : 7592 - 7597
  • [47] An Adaptive Model Averaging Procedure for Federated Learning (AdaFed)
    Giuseppi, Alessandro
    Della Torre, Lucrezia
    Menegatti, Danilo
    Priscoli, Francesco Delli
    Pietrabissa, Antonio
    Poli, Cecilia
    JOURNAL OF ADVANCES IN INFORMATION TECHNOLOGY, 2022, 13 (06) : 539 - 548
  • [48] AFL: Adaptive Federated Learning Based on Personalized Model and Adaptive Communication
    Wu, Xing
    Liu, Fei Xiang
    Zhao, Yue
    Zhao, Ming
    NEW TRENDS IN INTELLIGENT SOFTWARE METHODOLOGIES, TOOLS AND TECHNIQUES, 2021, 337 : 359 - 366
  • [49] Client Selection with Bandwidth Allocation in Federated Learning
    Kuang, Junqian
    Yang, Miao
    Zhu, Hongbin
    Qian, Hua
    2021 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM), 2021,
  • [50] Online Client Scheduling for Fast Federated Learning
    Xu, Bo
    Xia, Wenchao
    Zhang, Jun
    Quek, Tony Q. S.
    Zhu, Hongbo
    IEEE WIRELESS COMMUNICATIONS LETTERS, 2021, 10 (07) : 1434 - 1438