Joint Client-and-Sample Selection for Federated Learning via Bi-Level Optimization

Cited by: 0
Authors
Li, Anran [1 ]
Wang, Guangjing [2 ]
Hu, Ming [3 ]
Sun, Jianfei [3 ]
Zhang, Lan [4 ]
Tuan, Luu Anh [5 ]
Yu, Han [5 ]
Affiliations
[1] Yale Univ, Sch Med, Dept Biomed Informat & Data Sci, New Haven, CT 06520 USA
[2] Univ S Florida, Dept Comp Sci & Engn, Tampa, FL 33620 USA
[3] Singapore Management Univ, Sch Comp & Informat Syst, Singapore 188065, Singapore
[4] Univ Sci & Technol China, Sch Comp Sci & Technol, Hefei 230026, Peoples R China
[5] Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore 639798, Singapore
Funding
National Research Foundation of Singapore; Fundamental Research Funds for the Central Universities; National Key Research and Development Program of China;
Keywords
Training; Computational modeling; Data models; Noise measurement; Noise; Optimization; Servers; Bi-level optimization; federated learning; noisy data detection; sample selection;
DOI
10.1109/TMC.2024.3455331
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Federated Learning (FL) enables massive numbers of local data owners to collaboratively train a deep learning model without disclosing their private data. The importance of local data samples from different data owners to FL models varies widely. This problem is exacerbated by noisy data, which exhibit large losses similar to those of important (hard) samples. There is currently no FL approach that can effectively distinguish hard samples (which are beneficial) from noisy samples (which are harmful). To bridge this gap, we propose the joint Federated Meta-Weighting based Client and Sample Selection (FedMW-CSS) approach, which simultaneously mitigates label noise and selects hard samples. It is a bi-level optimization approach for FL client-and-sample selection and global model construction that achieves hard-sample-aware, noise-robust learning in a privacy-preserving manner. It performs meta-learning based online approximation to iteratively update global FL models, select the most positively influential samples, and handle training data noise. To exploit both instance-level and class-level information for better performance, FedMW-CSS efficiently learns class-level weights by manipulating gradients at the class level, e.g., it performs a gradient descent step on the class-level weights that relies only on intermediate gradients. Theoretically, we analyze the privacy guarantees and convergence of FedMW-CSS. Extensive experimental comparisons against eight state-of-the-art baselines on six real-world datasets, in the presence of data noise and heterogeneity, show that FedMW-CSS achieves up to 28.5% higher test accuracy while reducing communication and computation costs by at least 49.3% and 1.2%, respectively.
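
The bi-level meta-weighting step described in the abstract can be illustrated with a short sketch. Below is a minimal PyTorch example of one round, assuming a single linear classifier so the inner "virtual" update can be written with plain torch.autograd; all names (class_logits, inner_lr, the median-based selection heuristic, the synthetic batches) are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of one bi-level meta-weighting round (hypothetical names,
# not the paper's code). Inner problem: a class-weighted training loss and
# a differentiable "virtual" SGD step on the model. Outer problem: a meta
# loss on a small trusted batch drives a gradient step on class-level weights.

import torch
import torch.nn.functional as F

torch.manual_seed(0)
n_classes, n_features, inner_lr, meta_lr = 4, 16, 0.1, 0.05

# Model parameters of a single linear layer, kept as raw tensors so the
# virtual update W - lr * grad stays inside the autograd graph.
W = torch.randn(n_classes, n_features, requires_grad=True)
b = torch.zeros(n_classes, requires_grad=True)

# Class-level weights: one learnable scalar per class (outer variables).
class_logits = torch.zeros(n_classes, requires_grad=True)

def forward(W, b, x):
    return x @ W.t() + b

# Synthetic stand-ins for a noisy local batch and a small trusted meta batch.
x_train = torch.randn(32, n_features); y_train = torch.randint(0, n_classes, (32,))
x_meta = torch.randn(8, n_features);   y_meta = torch.randint(0, n_classes, (8,))

# --- Inner problem: weighted training loss, virtual SGD step on (W, b). ---
class_w = torch.softmax(class_logits, dim=0)          # normalized class weights
per_sample = F.cross_entropy(forward(W, b, x_train), y_train, reduction="none")
inner_loss = (class_w[y_train] * per_sample).mean()
gW, gb = torch.autograd.grad(inner_loss, (W, b), create_graph=True)
W_virt, b_virt = W - inner_lr * gW, b - inner_lr * gb  # differentiable w.r.t. class_logits

# --- Outer problem: meta loss of the virtual model updates class weights. ---
meta_loss = F.cross_entropy(forward(W_virt, b_virt, x_meta), y_meta)
g_cls, = torch.autograd.grad(meta_loss, (class_logits,))
with torch.no_grad():
    class_logits -= meta_lr * g_cls                    # gradient step on class-level weights

# Placeholder selection heuristic: keep samples whose weighted loss is below
# the batch median; down-weighted classes with persistently large losses are
# treated as noisy, while high-weight high-loss samples survive as hard samples.
weighted = class_w.detach()[y_train] * per_sample.detach()
keep = weighted < weighted.median()
print("selected", int(keep.sum()), "of", len(keep), "samples")
```

Note that the outer gradient g_cls depends only on the intermediate gradients (gW, gb) retained by create_graph=True, which matches the abstract's point that the class-level update relies solely on intermediate gradients rather than on raw local data.
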
Pages: 15196-15209
Page count: 14
Related Papers
50 records in total
  • [1] Addressing Heterogeneity in Federated Learning with Client Selection via Submodular Optimization
    Zhang, Jinghui
    Wang, Jiawei
    Li, Yaning
    Xin, Fa
    Dong, Fang
    Luo, Junzhou
    Wu, Zhihua
    ACM TRANSACTIONS ON SENSOR NETWORKS, 2024, 20 (02)
  • [2] Joint Client Selection and Training Optimization for Energy-Efficient Federated Learning
    Yan, Kang
    Shu, Nina
    Wu, Tao
    Liu, Chunsheng
    Huang, Jun
    Yu, Jingbo
    2023 19TH INTERNATIONAL CONFERENCE ON MOBILITY, SENSING AND NETWORKING, MSN 2023, 2023, : 849 - 854
  • [3] Provable Representation Learning for Imitation Learning via Bi-level Optimization
    Arora, Sanjeev
    Du, Simon S.
    Kakade, Sham
    Luo, Yuping
    Saunshi, Nikunj
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119
  • [4] Joint Unsupervised Learning of Optical Flow and Egomotion with Bi-Level Optimization
    Jiang, Shihao
    Campbell, Dylan
    Liu, Miaomiao
    Gould, Stephen
    Hartley, Richard
    2020 INTERNATIONAL CONFERENCE ON 3D VISION (3DV 2020), 2020, : 682 - 691
  • [5] A bi-level optimization model for technology selection
    Aviso, Kathleen B.
    Chiu, Anthony S. F.
    Ubando, Aristotle T.
    Tan, Raymond R.
    JOURNAL OF INDUSTRIAL AND PRODUCTION ENGINEERING, 2021, 38 (08) : 573 - 580
  • [6] Joint Client Selection and Bandwidth Allocation Algorithm for Federated Learning
    Ko, Haneul
    Lee, Jaewook
    Seo, Sangwon
    Pack, Sangheon
    Leung, Victor C. M.
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2023, 22 (06) : 3380 - 3390
  • [7] LABO: Towards Learning Optimal Label Regularization via Bi-level Optimization
    Lu, Peng
    Rashid, Ahmad
    Kobyzev, Ivan
    Rezagholizadeh, Mehdi
    Langlais, Philippe
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, 2023, : 5759 - 5774
  • [8] Review on Research Trends of Optimization for Client Selection in Federated Learning
    Kim, Jaemin
    Song, Chihyun
    Paek, Jeongyeup
    Kwon, Jung-Hyok
    Cho, Sungrae
    38TH INTERNATIONAL CONFERENCE ON INFORMATION NETWORKING, ICOIN 2024, 2024, : 287 - 289
  • [9] Sample-level Data Selection for Federated Learning
    Li, Anran
    Zhang, Lan
    Tan, Juntao
    Qin, Yaxuan
    Wang, Junhao
    Li, Xiang-Yang
    IEEE CONFERENCE ON COMPUTER COMMUNICATIONS (IEEE INFOCOM 2021), 2021,
  • [10] Joint Client Selection and Privacy Compensation for Differentially Private Federated Learning
    Xu, Ruichen
    Zhang, Ying-Jun Angela
    Huang, Jianwei
    IEEE INFOCOM 2024-IEEE CONFERENCE ON COMPUTER COMMUNICATIONS WORKSHOPS, INFOCOM WKSHPS 2024, 2024,