Joint Client-and-Sample Selection for Federated Learning via Bi-Level Optimization

Cited: 0
Authors
Li, Anran [1 ]
Wang, Guangjing [2 ]
Hu, Ming [3 ]
Sun, Jianfei [3 ]
Zhang, Lan [4 ]
Tuan, Luu Anh [5 ]
Yu, Han [5 ]
Affiliations
[1] Yale Univ, Sch Med, Dept Biomed Informat & Data Sci, New Haven, CT 06520 USA
[2] Univ S Florida, Dept Comp Sci & Engn, Tampa, FL 33620 USA
[3] Singapore Management Univ, Sch Comp & Informat Syst, Singapore 188065, Singapore
[4] Univ Sci & Technol China, Sch Comp Sci & Technol, Hefei 230026, Peoples R China
[5] Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore 639798, Singapore
Funding
National Research Foundation, Singapore; Fundamental Research Funds for the Central Universities; National Key Research and Development Program of China
Keywords
Training; Computational modeling; Data models; Noise measurement; Noise; Optimization; Servers; Bi-level optimization; federated learning; noisy data detection; sample selection
DOI
10.1109/TMC.2024.3455331
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Federated Learning (FL) enables massive numbers of local data owners to collaboratively train a deep learning model without disclosing their private data. The importance of local data samples from various data owners to FL models varies widely. This is exacerbated by the presence of noisy data, which exhibit large losses similar to important (hard) samples. Currently, there is no FL approach that can effectively distinguish hard samples (which are beneficial) from noisy samples (which are harmful). To bridge this gap, we propose the joint Federated Meta-Weighting based Client and Sample Selection (FedMW-CSS) approach to simultaneously mitigate label noise and perform hard sample selection. It is a bi-level optimization approach for FL client-and-sample selection and global model construction that achieves hard-sample-aware, noise-robust learning in a privacy-preserving manner. It performs meta-learning-based online approximation to iteratively update global FL models, select the most positively influential samples, and handle training data noise. To exploit both instance-level and class-level information for better performance, FedMW-CSS efficiently learns class-level weights by manipulating gradients at the class level, e.g., it performs a gradient descent step on class-level weights that relies only on intermediate gradients. Theoretically, we analyze the privacy guarantees and convergence of FedMW-CSS. Extensive experimental comparisons against eight state-of-the-art baselines on six real-world datasets, in the presence of data noise and heterogeneity, show that FedMW-CSS achieves up to 28.5% higher test accuracy, while reducing communication and computation costs by at least 49.3% and 1.2%, respectively.
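As a rough, non-authoritative sketch of the meta-weighting idea the abstract describes, the snippet below implements one bi-level reweighting step in PyTorch. It is not the authors' FedMW-CSS implementation: the toy linear model, the per_sample_loss and meta_reweight_step helpers, the small clean "meta" batch, and the inner_lr value are all illustrative assumptions. Instance-level weights (eps) and class-level weights (c) are learned from intermediate gradients through a virtual inner update, loosely mirroring the description above.

```python
# A minimal sketch of meta-learning-based bi-level sample reweighting,
# in the spirit of the abstract (NOT the authors' FedMW-CSS code).
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Hypothetical toy linear classifier; params is the list [W, b].
W = torch.randn(10, 3, requires_grad=True)
b = torch.zeros(3, requires_grad=True)

def per_sample_loss(params, x, y):
    W, b = params
    return F.cross_entropy(x @ W + b, y, reduction="none")

def meta_reweight_step(params, x, y, x_meta, y_meta, n_cls=3, inner_lr=0.1):
    """One bi-level step: learn instance- and class-level weights from a
    small trusted meta batch, then return weights for the noisy batch."""
    eps = torch.zeros(x.size(0), requires_grad=True)  # instance-level weights
    c = torch.zeros(n_cls, requires_grad=True)        # class-level weights
    # Inner problem: a virtual SGD step under the weighted training loss.
    inner_loss = ((eps + c[y]) * per_sample_loss(params, x, y)).sum()
    grads = torch.autograd.grad(inner_loss, params, create_graph=True)
    fast = [p - inner_lr * g for p, g in zip(params, grads)]
    # Outer problem: evaluate the virtual model on the clean meta batch.
    meta_loss = per_sample_loss(fast, x_meta, y_meta).mean()
    g_eps, g_c = torch.autograd.grad(meta_loss, [eps, c])
    # A sample is "positively influential" if up-weighting it lowers the
    # meta loss; negative-influence (likely noisy) samples are zeroed out.
    w = torch.clamp(-(g_eps + g_c[y]), min=0.0)
    return w / (w.sum() + 1e-12)

# Usage: a noisy local batch plus a small clean meta batch.
x, y = torch.randn(32, 10), torch.randint(0, 3, (32,))
x_m, y_m = torch.randn(8, 10), torch.randint(0, 3, (8,))
w = meta_reweight_step([W, b], x, y, x_m, y_m)
print(w)  # normalized weights; zeros mark samples flagged as likely noise
```

In a federated setting, each client would run such a step locally and use the resulting weights to drive its weighted model update; this sketch deliberately omits client selection, server-side aggregation, and the privacy mechanisms the paper analyzes.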
Pages: 15196 - 15209
Page count: 14
Related Papers
50 records in total
  • [31] Learning Koopman Operators with Control Using Bi-level Optimization
    Huang, Daning
    Prasetyo, Muhammad Bayu
    Yu, Yin
    Geng, Junyi
    2023 62ND IEEE CONFERENCE ON DECISION AND CONTROL, CDC, 2023, : 2147 - 2152
  • [32] The Bi-Level Particle Swarm Optimization for Joint Pricing in a Supply Chain
    Mansyuri, Umar
    Panudju, Andreas Tri
    Sitorus, Helena
    Spalanzani, Widya
    Nurhasanah, Nunung
    Khaerudin, Dedy
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2024, 15 (04) : 745 - 753
  • [33] Task-aware world model learning with meta weighting via bi-level optimization
    Yuan, Huining
    Dou, Hongkun
    Jiang, Xingyu
    Deng, Yue
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [34] Bi-Level Spectral Feature Selection
    Hu, Zebiao
    Wang, Jian
    Zhang, Kai
    Pedrycz, Witold
    Pal, Nikhil R.
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, : 1 - 15
  • [35] Bi-level variable selection via adaptive sparse group Lasso
    Fang, Kuangnan
    Wang, Xiaoyan
    Zhang, Shengwei
    Zhu, Jianping
    Ma, Shuangge
    JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION, 2015, 85 (13) : 2750 - 2760
  • [36] Client Selection with Bandwidth Allocation in Federated Learning
    Kuang, Junqian
    Yang, Miao
    Zhu, Hongbin
    Qian, Hua
    2021 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM), 2021,
  • [37] Bi-level gene selection of cancer by combining clustering and sparse learning
    Chen, J.
    Wen, B.
    COMPUTERS IN BIOLOGY AND MEDICINE, 2024, 172
  • [38] Towards Client Selection in Satellite Federated Learning
    Wu, Changhao
    He, Siyang
    Yin, Zengshan
    Guo, Chongbin
    APPLIED SCIENCES-BASEL, 2024, 14 (03):
  • [39] A review on client selection models in federated learning
    Panigrahi, Monalisa
    Bharti, Sourabh
    Sharma, Arun
    WILEY INTERDISCIPLINARY REVIEWS-DATA MINING AND KNOWLEDGE DISCOVERY, 2023, 13 (06)
  • [40] Active Client Selection for Clustered Federated Learning
    Huang, Honglan
    Shi, Wei
    Feng, Yanghe
    Niu, Chaoyue
    Cheng, Guangquan
    Huang, Jincai
    Liu, Zhong
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (11) : 16424 - 16438