FedDSS: A data-similarity approach for client selection in horizontal federated learning

Cited by: 0
Authors
Nguyen, Tuong Minh [1 ]
Poh, Kim Leng [1 ]
Chong, Shu-Ling [2 ]
Lee, Jan Hau [3 ,4 ]
Affiliations
[1] Natl Univ Singapore, Dept Ind Syst Engn & Management, Singapore 117576, Singapore
[2] KK Womens & Childrens Hosp, Childrens Emergency, Singapore 229899, Singapore
[3] Duke NUS Med Sch, SingHlth Duke NUS Paediat Acad Clin Programme, Singapore 169857, Singapore
[4] KK Womens & Childrens Hosp, Childrens Intens Care Unit, Singapore 229899, Singapore
Keywords
Federated learning; Non-i.i.d; Client selection; Data similarity; Pediatric sepsis;
DOI
10.1016/j.ijmedinf.2024.105650
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Background and objective: Federated learning (FL) is an emerging distributed learning framework that allows multiple clients (hospitals, institutions, smart devices, etc.) to collaboratively train a centralized machine learning model without disclosing personal data. It has the potential to address several healthcare challenges, including a lack of training data, data privacy, and security concerns. However, model learning under FL is affected by non-i.i.d. data: varying data distributions across clients lead to severe model divergence and reduced performance. To address this problem, we propose FedDSS, Federated Data Similarity Selection, a framework that uses a data-similarity approach to select clients without compromising client data privacy. Methods: FedDSS comprises a statistical data-similarity metric, an N-similar-neighbor network, and a network-based selection strategy. We assessed FedDSS' performance against FedAvg's in i.i.d. and non-i.i.d. settings with two public pediatric sepsis datasets (PICD and MIMIC-III). Selection fairness was measured using entropy. Simulations were repeated five times to evaluate average loss, true positive rate (TPR), and entropy. Results: In the i.i.d. setting on PICD, FedDSS achieved a higher TPR starting from the 9th round, surpassing 0.6 three rounds earlier than FedAvg. On MIMIC-III, FedDSS's loss decreased significantly from the 13th round, with TPR > 0.8 by the 2nd round, two rounds ahead of FedAvg (at the 4th round). In the non-i.i.d. setting, FedDSS achieved TPR > 0.7 by the 4th round and > 0.8 by the 7th round, earlier than FedAvg (at the 5th and 11th rounds). In both settings, FedDSS showed reasonable fairness (entropy of 2.2 and 2.1). Conclusion: We demonstrated that FedDSS contributes to improved learning in FL by achieving faster convergence, reaching the desired TPR with fewer communication rounds, and potentially enhancing sepsis prediction (TPR) over FedAvg.
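The abstract names three components: a statistical data-similarity metric computed without sharing raw data, an N-similar-neighbor network over clients, and an entropy measure of selection fairness. A minimal Python sketch of these ideas follows; it is illustrative only, assuming cosine similarity over per-client summary-statistic vectors (the paper's actual metric and selection strategy are not specified here), and all function names are hypothetical.

```python
import numpy as np

def client_similarity(stats_a, stats_b):
    # Cosine similarity between per-client summary-statistic vectors
    # (e.g., per-feature means); raw patient data never leaves a client.
    a, b = np.asarray(stats_a, float), np.asarray(stats_b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def n_similar_neighbors(stats, n):
    # Build the N-similar-neighbor structure: for each client, keep the
    # indices of its n most similar peers (self excluded).
    k = len(stats)
    sims = np.array([[client_similarity(stats[i], stats[j]) for j in range(k)]
                     for i in range(k)])
    np.fill_diagonal(sims, -np.inf)
    return {i: list(np.argsort(sims[i])[::-1][:n]) for i in range(k)}

def selection_entropy(counts):
    # Shannon entropy (bits) of per-client selection counts;
    # higher entropy means more uniform, i.e. fairer, selection.
    p = np.asarray(counts, float)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log2(p)).sum())
```

For reference, uniform selection over five clients gives entropy log2(5) ≈ 2.32 bits, so values such as the reported 2.1–2.2 indicate near-uniform participation under this kind of measure.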
Pages: 9
Related Papers (50 in total)
  • [1] Federated learning client selection algorithm based on gradient similarity
    Hu, Lingxi
    Hu, Yuanyuan
    Jiang, Linhua
    Long, Wei
    CLUSTER COMPUTING-THE JOURNAL OF NETWORKS SOFTWARE TOOLS AND APPLICATIONS, 2025, 28 (02)
  • [2] Towards Instant Clustering Approach for Federated Learning Client Selection
    Arisdakessian, Sarhad
    Wahab, Omar Abdel
    Mourad, Azzam
    Otrok, Hadi
    2023 INTERNATIONAL CONFERENCE ON COMPUTING, NETWORKING AND COMMUNICATIONS, ICNC, 2023: 409-413
  • [3] Client Selection for Wireless Federated Learning With Data and Latency Heterogeneity
    Chen, Xiaobing
    Zhou, Xiangwei
    Zhang, Hongchao
    Sun, Mingxuan
    Vincent Poor, H.
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (19): 32183-32196
  • [4] Client Selection in Hierarchical Federated Learning
    Trindade, Silvana
    da Fonseca, Nelson L. S.
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (17): 28480-28495
  • [5] Client Selection for Federated Bayesian Learning
    Yang, Jiarong
    Liu, Yuan
    Kassab, Rahif
    IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, 2023, 41 (04): 915-928
  • [6] Incentive Design for Heterogeneous Client Selection: A Robust Federated Learning Approach
    Pene, Papa
    Liao, Weixian
    Yu, Wei
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (04): 5939-5950
  • [7] Data Quality-Aware Client Selection in Heterogeneous Federated Learning
    Song, Shinan
    Li, Yaxin
    Wan, Jin
    Fu, Xianghua
    Jiang, Jingyan
    MATHEMATICS, 2024, 12 (20)
  • [8] Optimizing Client and Data Selection in Federated Learning: A Centralized Optimization and Decentralized Game-Theoretic Approach
    Lin, Junkun
    Luo, Jingjing
    Wang, Tong
    Gao, Lin
    IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023: 5256-5261
  • [9] EFFICIENT CLIENT CONTRIBUTION EVALUATION FOR HORIZONTAL FEDERATED LEARNING
    Zhao, Jie
    Zhu, Xinghua
    Wang, Jianzong
    Xiao, Jing
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021: 3060-3064
  • [10] Client Selection with Bandwidth Allocation in Federated Learning
    Kuang, Junqian
    Yang, Miao
    Zhu, Hongbin
    Qian, Hua
    2021 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM), 2021