Dynamic Adapter Meets Prompt Tuning: Parameter-Efficient Transfer Learning for Point Cloud Analysis

Cited by: 4
|
Authors
Zhou, Xin [1 ]
Liang, Dingkang [1 ]
Xu, Wei [1 ]
Zhu, Xingkui [1 ]
Xu, Yihan [1 ]
Zou, Zhikang [2 ]
Bai, Xiang [1 ]
Affiliations
[1] Huazhong Univ Sci & Technol, Wuhan, Hubei, Peoples R China
[2] Baidu Inc, Beijing, Peoples R China
Keywords
DOI
10.1109/CVPR52733.2024.01393
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Point cloud analysis has achieved outstanding performance by transferring point cloud pre-trained models. However, existing methods for model adaptation usually update all model parameters, i.e., the full fine-tuning paradigm, which is inefficient as it incurs high computational cost (e.g., training GPU memory) and requires massive storage space. In this paper, we aim to study parameter-efficient transfer learning for point cloud analysis with an ideal trade-off between task performance and parameter efficiency. To achieve this goal, we freeze the parameters of the default pre-trained models and then propose the Dynamic Adapter, which generates a dynamic scale for each token, considering the token's significance to the downstream task. We further seamlessly integrate the Dynamic Adapter with Prompt Tuning (DAPT) by constructing Internal Prompts, capturing the instance-specific features for interaction. Extensive experiments conducted on five challenging datasets demonstrate that the proposed DAPT achieves superior performance compared to the full fine-tuning counterparts while significantly reducing the trainable parameters and training GPU memory by 95% and 35%, respectively. Code is available at https://github.com/LMD0311/DAPT.
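The per-token dynamic scaling described in the abstract can be sketched roughly as follows. This is a minimal NumPy illustration under assumed names and shapes (the paper's actual architecture, bottleneck size, and gating function may differ): a standard bottleneck adapter whose residual update is gated by a scale predicted from each token, so tokens deemed more significant receive a larger adapter contribution while the frozen backbone path is untouched.

```python
import numpy as np

rng = np.random.default_rng(0)

def dynamic_adapter(tokens, W_down, W_up, w_scale):
    """Hypothetical sketch of a dynamic adapter block.

    tokens: (N, d) token features from a frozen pre-trained backbone.
    Instead of one shared scaling factor, a per-token scale is predicted
    from the token itself, reflecting its significance to the task.
    """
    hidden = np.maximum(tokens @ W_down, 0.0)            # bottleneck down-projection + ReLU
    delta = hidden @ W_up                                # up-projection back to (N, d)
    scale = 1.0 / (1.0 + np.exp(-(tokens @ w_scale)))    # per-token sigmoid gate, shape (N, 1)
    return tokens + scale * delta                        # gated residual update

d, r, n = 16, 4, 8                                       # embed dim, bottleneck dim, num tokens
tokens = rng.normal(size=(n, d))
out = dynamic_adapter(tokens,
                      rng.normal(size=(d, r)) * 0.1,     # W_down (trainable)
                      rng.normal(size=(r, d)) * 0.1,     # W_up (trainable)
                      rng.normal(size=(d, 1)) * 0.1)     # w_scale (trainable)
print(out.shape)  # → (8, 16)
```

Only the small adapter matrices would be trained; the backbone producing `tokens` stays frozen, which is what yields the reported savings in trainable parameters.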
Pages: 14707-14717
Page count: 11
Related papers
50 records total
  • [21] Transferrable DP-Adapter Tuning: A Privacy-Preserving Multimodal Parameter-Efficient Fine-Tuning Framework
    Ji, Lixia
    Xiao, Shijie
    Xu, Bingzhi
    Zhang, Han
    2024 IEEE 24TH INTERNATIONAL CONFERENCE ON SOFTWARE QUALITY, RELIABILITY AND SECURITY, QRS, 2024, : 471 - 482
  • [22] LLM-Adapters: An Adapter Family for Parameter-Efficient Fine-Tuning of Large Language Models
    Hu, Zhiqiang
    Wang, Lei
    Lan, Yihuai
    Xu, Wanyu
    Lim, Ee-Peng
    Bing, Lidong
    Xu, Xing
    Poria, Soujanya
    Lee, Roy Ka-Wei
    2023 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING, EMNLP 2023, 2023, : 5254 - 5276
  • [23] Parameter-Efficient Transfer Learning for Medical Visual Question Answering
    Liu, Jiaxiang
    Hu, Tianxiang
    Zhang, Yan
    Feng, Yang
    Hao, Jin
    Lv, Junhui
    Liu, Zuozhu
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE, 2024, 8 (04): : 2816 - 2826
  • [24] LoRAPrune: Structured Pruning Meets Low-Rank Parameter-Efficient Fine-Tuning
    Zhang, Mingyang
    Chen, Hao
    Shen, Chunhua
    Yang, Zhen
    Ou, Linlin
    Yu, Xinyi
    Zhuang, Bohan
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: ACL 2024, 2024, : 3013 - 3026
  • [25] Automatic depression severity assessment with deep learning using parameter-efficient tuning
    Lau, Clinton
    Zhu, Xiaodan
    Chan, Wai-Yip
    FRONTIERS IN PSYCHIATRY, 2023, 14
  • [26] PreAdapter: Sparse Adaptive Parameter-efficient Transfer Learning for Language Models
    Mao, Chenyang
    Jin, Xiaoxiao
    Yue, Dengfeng
    Leng, Tuo
    2024 7TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND BIG DATA, ICAIBD 2024, 2024, : 218 - 225
  • [27] Parameter-Efficient Transfer Learning for Audio-Visual-Language Tasks
    Liu, Hongye
    Xie, Xianhai
    Gao, Yang
    Yu, Zhou
    PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2023, 2023, : 387 - 396
  • [28] SCALEARN: Simple and Highly Parameter-Efficient Task Transfer by Learning to Scale
    Frohmann, Markus
    Holtermann, Carolin
    Masoudian, Shahed
    Lauscher, Anne
    Rekabsaz, Navid
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: ACL 2024, 2024, : 11743 - 11776
  • [29] Pass-Tuning: Towards Structure-Aware Parameter-Efficient Tuning for Code Representation Learning
    Chen, Nuo
    Sun, Qiushi
    Wang, Jianing
    Li, Xiang
    Gao, Ming
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS - EMNLP 2023, 2023, : 577 - 591
  • [30] Towards Robust and Generalized Parameter-Efficient Fine-Tuning for Noisy Label Learning
    Kim, Yeachan
    Kim, Junho
    Lee, SangKeun
    PROCEEDINGS OF THE 62ND ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 1: LONG PAPERS, 2024, : 5922 - 5936