FedPFT: Federated Proxy Fine-Tuning of Foundation Models

Cited: 0
Authors
Peng, Zhaopeng [1 ]
Fan, Xiaoliang [1 ]
Chen, Yufan [1 ]
Wang, Zheng [1 ]
Pan, Shirui [2 ]
Wen, Chenglu [1 ]
Zhang, Ruisheng [3 ]
Wang, Cheng [1 ]
Affiliations
[1] Xiamen Univ, Sch Informat, Fujian Key Lab Sensing & Comp Smart Cities, Xiamen, Fujian, Peoples R China
[2] Griffith Univ, Sch Informat & Commun Technol, Nathan, Qld, Australia
[3] Lanzhou Univ, Sch Informat Sci & Engn, Lanzhou, Gansu, Peoples R China
Keywords
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Adapting Foundation Models (FMs) to downstream tasks through Federated Learning (FL) emerges as a promising strategy for protecting both data privacy and valuable FMs. Existing methods fine-tune the FM by allocating a sub-FM to each client in FL; however, this leads to suboptimal performance due to insufficient tuning and the inevitable accumulation of gradient errors. In this paper, we propose Federated Proxy Fine-Tuning (FedPFT), a novel method that enhances FM adaptation to downstream tasks through FL via two key modules. First, the sub-FM construction module employs a layer-wise compression approach that, by emphasizing crucial neurons, enables comprehensive fine-tuning of the FM across all layers. Second, the sub-FM alignment module conducts two-step distillation (layer-level and neuron-level), before and during FL fine-tuning respectively, to reduce gradient error by accurately aligning the sub-FM with the FM under theoretical guarantees. Experimental results on seven commonly used datasets (four text and three vision) demonstrate the superiority of FedPFT. Our code is available at https://github.com/pzpdzd/FedPFT.
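To illustrate the two modules described in the abstract, the following is a minimal sketch assuming a simple layer-dropping scheme for sub-FM construction and a mean-squared-error loss for layer-level distillation; the function names, the keep-every-other-layer rule, and the loss form are illustrative assumptions, not the authors' actual implementation.

```python
def build_sub_fm(layers, keep_every=2):
    """Sub-FM construction (hypothetical): layer-wise compression that
    keeps every `keep_every`-th layer, so every depth of the full FM is
    still represented in the smaller proxy model."""
    return [layer for i, layer in enumerate(layers) if i % keep_every == 0]

def layer_level_distill_loss(fm_outputs, sub_fm_outputs, keep_every=2):
    """Sub-FM alignment (hypothetical): mean squared error between the
    outputs of each kept FM layer and the matching sub-FM layer, i.e.
    a layer-level distillation objective."""
    matched = fm_outputs[::keep_every][:len(sub_fm_outputs)]
    diffs = [(a - b) ** 2 for a, b in zip(matched, sub_fm_outputs)]
    return sum(diffs) / len(diffs)

layers = ["L0", "L1", "L2", "L3", "L4", "L5"]
print(build_sub_fm(layers))                              # ['L0', 'L2', 'L4']
print(layer_level_distill_loss([1.0, 2.0, 3.0, 4.0],    # layers L0, L2 kept
                               [1.5, 3.5]))              # 0.25
```

In the paper's setting, clients would fine-tune the compressed proxy under FL while the alignment loss keeps the proxy's gradients close to those of the full FM; the sketch above only shows the shape of those two operations.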
Pages: 4806-4814
Page count: 9