PFEDEDIT: Personalized Federated Learning via Automated Model Editing

Times Cited: 0
Authors
Yuan, Haolin [1 ]
Paul, William [2 ]
Aucott, John [3 ]
Burlina, Philippe [2 ]
Cao, Yinzhi [1 ]
Affiliations
[1] Johns Hopkins Univ, Baltimore, MD 21218 USA
[2] Johns Hopkins Appl Phys Lab, Laurel, MD USA
[3] Johns Hopkins Univ, Sch Med, Baltimore, MD USA
Funding
U.S. National Science Foundation
DOI
10.1007/978-3-031-72986-7_6
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Federated learning (FL) allows clients to train a deep learning model collaboratively while keeping their private data local. One challenging problem facing FL is that model utility drops significantly once the data distribution becomes heterogeneous, or non-i.i.d., among clients. A promising solution is to personalize models for each client, e.g., by keeping some layers local without aggregation, an approach called personalized FL. However, previous personalized FL approaches often suffer from sub-optimal utility because their choice of personalization layers is based on empirical knowledge and fixed across datasets and distributions. In this work, we design PFedEdit, the first federated learning framework that leverages automated model editing to optimize the choice of personalization layers and improve model utility under a variety of data distributions, including non-i.i.d. ones. The high-level idea of PFedEdit is to assess how effectively editing each global model layer improves model utility on the local data distribution, and then to apply edits to the top-k most effective layers. Our evaluation shows that PFedEdit outperforms six state-of-the-art approaches on three benchmark datasets by 6% in model performance on average, with the largest accuracy improvement being 26.6%. PFedEdit is open-source and available at this repository: https://github.com/Haolin-Yuan/PFedEdit
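Below is a minimal Python sketch, not taken from the paper, of the layer-selection idea the abstract describes: tentatively edit each layer of the received global model, measure the utility change on the client's local data, and keep edits only on the top-k most effective layers. The names select_personalization_layers, edit_fn, and eval_fn are hypothetical placeholders; the actual editing and scoring procedures are those defined in the paper and its repository.

```python
# Minimal sketch (not the authors' implementation) of per-layer edit scoring:
# edit one layer at a time, compare local utility against the unedited global
# model, and return the names of the k most effective layers to personalize.
import copy

import torch.nn as nn


def select_personalization_layers(global_model: nn.Module,
                                  local_loader,
                                  edit_fn,
                                  eval_fn,
                                  k: int = 3):
    """Hypothetical helper: rank layers by how much editing them improves
    utility on this client's local data.

    edit_fn(model, layer_name, loader) -> model with that layer edited locally
    eval_fn(model, loader)             -> scalar utility, e.g. local accuracy
    """
    base_utility = eval_fn(global_model, local_loader)
    gains = {}
    for layer_name, _ in global_model.named_children():
        candidate = copy.deepcopy(global_model)  # keep the global model intact
        candidate = edit_fn(candidate, layer_name, local_loader)
        gains[layer_name] = eval_fn(candidate, local_loader) - base_utility
    # Layers whose edits help local utility the most are edited/kept locally;
    # the remaining layers continue to follow the aggregated global model.
    return sorted(gains, key=gains.get, reverse=True)[:k]
```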
Pages: 91-107
Number of pages: 17
Related Papers
50 records in total
  • [41] Methods and Prospects of Personalized Federated Learning
    Sun, Yanhua
    Wang, Zihang
    Liu, Chang
    Yang, Ruizhe
    Li, Meng
    Wang, Zhuwei
    Computer Engineering and Applications, 2024, 60 (20) : 68 - 83
  • [42] Personalized Federated Learning with Semisupervised Distillation
    Li, Xianxian
    Gong, Yanxia
    Liang, Yuan
    Wang, Li-e
    SECURITY AND COMMUNICATION NETWORKS, 2021, 2021
  • [43] Gradient Free Personalized Federated Learning
    Chen, Haoyu
    Zhang, Yuxin
    Zhao, Jin
    Wang, Xin
    Xu, Yuedong
    53RD INTERNATIONAL CONFERENCE ON PARALLEL PROCESSING, ICPP 2024, 2024, : 971 - 980
  • [44] Clustered Graph Federated Personalized Learning
    Gauthier, Francois
    Gogineni, Vinay Chakravarthi
    Werner, Stefan
    Huang, Yih-Fang
    Kuh, Anthony
    2022 56TH ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS, AND COMPUTERS, 2022, : 744 - 748
  • [45] Reliable and Interpretable Personalized Federated Learning
    Qin, Zixuan
    Yang, Liu
    Wang, Qilong
    Han, Yahong
    Hu, Qinghua
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2023, : 20422 - 20431
  • [46] Personalized Federated Learning With Differential Privacy
    Hu, Rui
    Guo, Yuanxiong
    Li, Hongning
    Pei, Qingqi
    Gong, Yanmin
    IEEE INTERNET OF THINGS JOURNAL, 2020, 7 (10) : 9530 - 9539
  • [47] Location prediction with personalized federated learning
    Wang, Shuang
    Wang, Bowei
    Yao, Shuai
    Qu, Jiangqin
    Pan, Yuezheng
    SOFT COMPUTING, 2022, 28 (Suppl 2) : 451 - 451
  • [48] AFL: Adaptive Federated Learning Based on Personalized Model and Adaptive Communication
    Wu, Xing
    Liu, Fei Xiang
    Zhao, Yue
    Zhao, Ming
    NEW TRENDS IN INTELLIGENT SOFTWARE METHODOLOGIES, TOOLS AND TECHNIQUES, 2021, 337 : 359 - 366
  • [49] Personalized Federated Learning with Moreau Envelopes
    Dinh, Canh T.
    Tran, Nguyen H.
    Nguyen, Tuan Dung
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [50] ActPerFL: Active Personalized Federated Learning
    Chen, Huili
    Ding, Jie
    Tramel, Eric
    Wu, Shuang
    Sahu, Anit Kumar
    Avestimehr, Salman
    Zhang, Tao
    PROCEEDINGS OF THE FIRST WORKSHOP ON FEDERATED LEARNING FOR NATURAL LANGUAGE PROCESSING (FL4NLP 2022), 2022, : 1 - 5