Personalised soft prompt tuning in pre-trained language models: Bridging multitask transfer learning and crowdsourcing learning

Cited by: 0
Authors
Tian, Zeshu [1 ]
Zhang, Hongli [1 ]
Wang, Yan [2 ]
Affiliations
[1] Harbin Inst Technol, Fac Comp, Harbin 150001, Heilongjiang, Peoples R China
[2] Macquarie Univ, Sch Comp, Sydney, NSW 2109, Australia
Keywords
Pre-trained language models; Crowdsourcing learning; Soft prompt tuning; Multitask transfer learning; Crowdsourcing information extraction;
DOI
10.1016/j.knosys.2024.112646
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Soft prompt tuning significantly enhances the performance of pre-trained language models, especially on complex tasks for which abundant annotated data is available. Crowdsourcing provides a cost-effective means of obtaining large-scale annotations; however, differing annotation criteria among annotators introduce noise into the data, which can undermine the effectiveness of soft prompt tuning and degrade performance. To address this issue, we conceptualise the annotations from each annotator as a subtask and frame crowdsourcing learning as multitask transfer learning. We propose a novel soft prompt tuning method that uses personalised prompts, learned through a knowledge distillation approach, to capture the annotation principles of individual annotators. To validate our hypothesis, we apply our method to four benchmark datasets across two crowdsourcing tasks: crowdsourced named entity recognition (CNER) and crowdsourced relation extraction (CRE). Our personalised soft prompt method yields significant improvements, with average gains of 8.96% on CNER and 14.44% on CRE over standard soft prompt tuning, while also achieving competitive results against state-of-the-art crowdsourcing methods.
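To make the abstract's idea concrete, below is a minimal PyTorch sketch of how per-annotator ("personalised") soft prompts and a distillation term toward a shared consensus prediction could be wired up. The class and function names (PersonalisedSoftPrompt, distillation_loss), the frozen stand-in encoder, and all hyperparameters are illustrative assumptions, not the paper's actual implementation.

```python
# Hedged sketch: personalised (per-annotator) soft prompts prepended to a frozen
# encoder's input embeddings, plus a soft-label KL term standing in for the
# knowledge-distillation step. Names and shapes are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PersonalisedSoftPrompt(nn.Module):
    def __init__(self, num_annotators, prompt_len, hidden_dim, num_labels):
        super().__init__()
        # One shared prompt plus one trainable prompt per annotator ("subtask").
        self.shared_prompt = nn.Parameter(torch.randn(prompt_len, hidden_dim) * 0.02)
        self.annotator_prompts = nn.Embedding(num_annotators, prompt_len * hidden_dim)
        self.prompt_len, self.hidden_dim = prompt_len, hidden_dim
        self.classifier = nn.Linear(hidden_dim, num_labels)

    def forward(self, token_embeds, annotator_ids, encoder):
        # token_embeds: (B, T, H) embeddings from a frozen PLM embedding layer.
        B = token_embeds.size(0)
        pers = self.annotator_prompts(annotator_ids).view(B, self.prompt_len, self.hidden_dim)
        shared = self.shared_prompt.unsqueeze(0).expand(B, -1, -1)
        # Prepend [shared prompt; personalised prompt] to the token embeddings.
        inputs = torch.cat([shared, pers, token_embeds], dim=1)
        hidden = encoder(inputs)                       # frozen encoder, (B, L, H)
        return self.classifier(hidden.mean(dim=1))     # (B, num_labels)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence between softened distributions; the "teacher" could be a
    # consensus/aggregated prediction over annotators.
    return F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                    F.softmax(teacher_logits / T, dim=-1),
                    reduction="batchmean") * (T * T)

# Toy usage with a stand-in frozen encoder (a real system would use a PLM).
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True), num_layers=2)
for p in encoder.parameters():
    p.requires_grad = False                            # only prompts + classifier are tuned
model = PersonalisedSoftPrompt(num_annotators=5, prompt_len=8, hidden_dim=64, num_labels=3)
token_embeds = torch.randn(2, 10, 64)                  # fake token embeddings
logits = model(token_embeds, torch.tensor([0, 3]), encoder)
```

Freezing the encoder and updating only the prompts and classifier mirrors the parameter efficiency of soft prompt tuning; under the paper's framing, each annotator-specific prompt plays the role of a subtask in multitask transfer learning, with distillation pulling the subtasks toward shared knowledge.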
Pages: 13