FedPFT: Federated Proxy Fine-Tuning of Foundation Models

Citations: 0
|
Authors
Peng, Zhaopeng [1 ]
Fan, Xiaoliang [1 ]
Chen, Yufan [1 ]
Wang, Zheng [1 ]
Pan, Shirui [2 ]
Wen, Chenglu [1 ]
Zhang, Ruisheng [3 ]
Wang, Cheng [1 ]
Affiliations
[1] Xiamen Univ, Sch Informat, Fujian Key Lab Sensing & Comp Smart Cities, Xiamen, Fujian, Peoples R China
[2] Griffith Univ, Sch Informat & Commun Technol, Nathan, Qld, Australia
[3] Lanzhou Univ, Sch Informat Sci & Engn, Lanzhou, Gansu, Peoples R China
Keywords
DOI
None available
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Adapting Foundation Models (FMs) to downstream tasks through Federated Learning (FL) is emerging as a promising strategy for protecting both data privacy and valuable FMs. Existing methods fine-tune the FM by allocating a sub-FM to clients in FL; however, this leads to suboptimal performance due to insufficient tuning and the inevitable accumulation of gradient errors. In this paper, we propose Federated Proxy Fine-Tuning (FedPFT), a novel method that enhances FM adaptation to downstream tasks through FL via two key modules. First, the sub-FM construction module employs a layer-wise compression approach that emphasizes crucial neurons, enabling comprehensive fine-tuning across all layers of the FM. Second, the sub-FM alignment module performs a two-step distillation, at the layer level and the neuron level, before and during FL fine-tuning respectively, to reduce gradient error by accurately aligning the sub-FM with the FM under theoretical guarantees. Experimental results on seven commonly used datasets (four text and three vision) demonstrate the superiority of FedPFT. Our code is available at https://github.com/pzpdzd/FedPFT.
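To make the two modules described in the abstract concrete, below is a minimal sketch of the workflow in PyTorch. All names (TinyFM, build_sub_fm, layer_distill_loss) and the toy dimensions are illustrative assumptions, not the authors' implementation; the neuron-level distillation step and the federated aggregation loop are omitted for brevity.

```python
# Hedged sketch of FedPFT's server-side steps: (1) layer-wise compression
# of an FM into a sub-FM, (2) layer-level distillation to align the sub-FM
# with the FM before FL fine-tuning. Names and sizes are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyFM(nn.Module):
    """Stand-in foundation model: a stack of transformer encoder layers."""
    def __init__(self, d_model=64, n_layers=8):
        super().__init__()
        self.d_model = d_model
        self.layers = nn.ModuleList(
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
            for _ in range(n_layers)
        )

    def forward(self, x, return_hidden=False):
        hidden = []
        for layer in self.layers:
            x = layer(x)
            hidden.append(x)
        return (x, hidden) if return_hidden else x

def build_sub_fm(fm: TinyFM, keep_every: int = 2) -> TinyFM:
    """Layer-wise compression: keep every `keep_every`-th FM layer.
    (The paper additionally emphasizes crucial neurons; this sketch only
    illustrates the layer-selection step.)"""
    sub = TinyFM(d_model=fm.d_model, n_layers=len(fm.layers) // keep_every)
    kept = [fm.layers[i] for i in range(0, len(fm.layers), keep_every)]
    for sub_layer, fm_layer in zip(sub.layers, kept):
        sub_layer.load_state_dict(fm_layer.state_dict())
    return sub

def layer_distill_loss(sub: TinyFM, fm: TinyFM, x: torch.Tensor) -> torch.Tensor:
    """Layer-level alignment: match each sub-FM hidden state to the FM
    hidden state at the corresponding depth."""
    with torch.no_grad():
        _, fm_hidden = fm(x, return_hidden=True)
    _, sub_hidden = sub(x, return_hidden=True)
    stride = len(fm_hidden) // len(sub_hidden)
    return sum(
        F.mse_loss(s, fm_hidden[(i + 1) * stride - 1])
        for i, s in enumerate(sub_hidden)
    ) / len(sub_hidden)

# One alignment step on the server before distributing the sub-FM to clients.
fm = TinyFM()
fm.eval()                            # frozen teacher; deterministic targets
sub_fm = build_sub_fm(fm, keep_every=2)
sub_fm.train()
opt = torch.optim.Adam(sub_fm.parameters(), lr=1e-4)
x = torch.randn(4, 16, 64)           # toy batch: (batch, seq, d_model)
opt.zero_grad()
loss = layer_distill_loss(sub_fm, fm, x)
loss.backward()
opt.step()
print(f"alignment loss: {loss.item():.4f}")
```

In this reading, clients fine-tune only the compressed sub-FM, and the alignment loss keeps the sub-FM's gradients a faithful proxy for the full FM's; how gradients are mapped back to the FM is a detail of the paper not reproduced here.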
Pages: 4806-4814
Number of pages: 9