Meta-prompt based learning for low-resource false information detection

Cited by: 5
Authors
Huang Y. [1, 2]
Gao M. [1, 2]
Wang J. [1, 2]
Yin J. [1, 2]
Shu K. [3]
Fan Q. [1, 2]
Wen J. [1, 2]
Affiliations
[1] Key Laboratory of Dependable Service Computing in Cyber Physical Society (Chongqing University), Ministry of Education, Chongqing
[2] School of Big Data and Software Engineering, Chongqing University, Chongqing
[3] Department of Computer Science, Illinois Institute of Technology, Chicago, IL
Source
Information Processing and Management | 2023, Vol. 60, Issue 3
Funding
National Natural Science Foundation of China
Keywords
False information detection; Meta learning; Prompt learning;
DOI
10.1016/j.ipm.2023.103279
Abstract
The wide spread of false information has detrimental effects on society, and false information detection has received wide attention. When new domains appear, the relevant labeled data is scarce, which poses severe challenges for detection. Previous work mainly leverages additional data or domain adaptation technology to assist detection. The former leads to a heavy data burden; the latter underutilizes the pre-trained language model because of the gap between the downstream task and the pre-training task, and it is also inefficient for model storage because a separate set of parameters must be kept for each domain. To this end, we propose a meta-prompt based learning (MAP) framework for low-resource false information detection. We exploit the potential of pre-trained language models by transforming the detection task into a pre-training task through template construction. To prevent a randomly initialized template from limiting performance, we learn an optimal initialization by leveraging the strength of meta learning in fast parameter adaptation. Combining meta learning and prompt learning for detection is non-trivial: constructing meta tasks that yield initialized parameters suitable for different domains, and setting up the prompt model's verbalizer for classification in the noisy low-resource scenario, are both challenging. For the former, we propose a multi-domain meta task construction method to learn domain-invariant meta knowledge. For the latter, we propose a prototype verbalizer to summarize category information and design a noise-resistant prototyping strategy to reduce the influence of noisy data. Extensive experiments on real-world data demonstrate the superiority of MAP in new domains of false information detection. © 2023 Elsevier Ltd
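
The abstract above combines two mechanisms: a meta-learned initialization for the prompt parameters shared across source domains, and a prototype verbalizer that maps encoder outputs to classes. The sketch below is a minimal, hedged illustration of those two ideas only, not the authors' released implementation; it assumes a Reptile-style outer loop, uses a toy linear encoder as a stand-in for a frozen pre-trained language model, and all names, sizes, and hyper-parameters are illustrative assumptions.

```python
# Minimal sketch (not the paper's code): (1) meta-learn a shared initialization
# for soft-prompt parameters over multi-domain tasks (Reptile-style outer loop
# assumed), (2) classify with a prototype verbalizer built from support examples.
import copy
import torch
import torch.nn.functional as F

EMB_DIM, PROMPT_LEN, N_CLASSES = 32, 4, 2   # assumed toy sizes

class PromptModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # Frozen "PLM" stand-in: maps (soft prompt + text features) to an embedding.
        self.encoder = torch.nn.Linear(EMB_DIM * (PROMPT_LEN + 1), EMB_DIM)
        for p in self.encoder.parameters():
            p.requires_grad = False
        # Learnable soft-prompt tokens: the only parameters adapted per domain.
        self.prompt = torch.nn.Parameter(0.02 * torch.randn(PROMPT_LEN, EMB_DIM))

    def forward(self, x):                                # x: (batch, EMB_DIM)
        p = self.prompt.flatten().expand(x.size(0), -1)  # broadcast prompt to batch
        return self.encoder(torch.cat([p, x], dim=-1))   # (batch, EMB_DIM)

def prototypes(emb, labels):
    # Prototype verbalizer: one mean embedding per class from the support set.
    return torch.stack([emb[labels == c].mean(dim=0) for c in range(N_CLASSES)])

def proto_loss(model, x, y, protos):
    logits = -torch.cdist(model(x), protos)              # nearer prototype -> higher score
    return F.cross_entropy(logits, y)

def meta_train(sample_task, meta_steps=100, inner_steps=3, inner_lr=0.1, meta_lr=0.05):
    """Outer loop over multi-domain meta tasks: each task adapts a copy of the
    prompt on its own data, then the shared prompt moves toward the result."""
    model = PromptModel()
    for _ in range(meta_steps):
        support_x, support_y, query_x, query_y = sample_task()  # one domain's few-shot task
        fast = copy.deepcopy(model)
        opt = torch.optim.SGD([fast.prompt], lr=inner_lr)
        for _ in range(inner_steps):                     # inner adaptation
            protos = prototypes(fast(support_x), support_y)
            opt.zero_grad()
            proto_loss(fast, query_x, query_y, protos).backward()
            opt.step()
        with torch.no_grad():                            # Reptile-style meta update
            model.prompt += meta_lr * (fast.prompt - model.prompt)
    return model

if __name__ == "__main__":
    def toy_task(k=8):      # random stand-in for sampling one domain's meta task
        x = torch.randn(2 * k, EMB_DIM)
        y = torch.arange(2 * k) % N_CLASSES              # both classes present
        return x[:k], y[:k], x[k:], y[k:]
    meta_train(toy_task, meta_steps=10)
```

The paper's noise-resistant prototyping strategy (down-weighting noisy support examples when averaging prototypes) is omitted here; the plain class-mean prototype stands in for it.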
Related Papers
50 records in total
  • [21] Low-Resource Neural Machine Translation Based on Improved Reptile Meta-learning Method
    Wu, Nier
    Hou, Hongxu
    Jia, Xiaoning
    Chang, Xin
    Li, Haoran
    MACHINE TRANSLATION, CCMT 2021, 2021, 1464 : 39 - 50
  • [22] Deep Learning for Audio Event Detection and Tagging on Low-Resource Datasets
    Morfi, Veronica
    Stowell, Dan
    APPLIED SCIENCES-BASEL, 2018, 8 (8)
  • [23] Self-supervised Meta-Prompt Learning with Meta-Gradient Regularization for Few-shot Generalization
    Pan, Kaihang
    Li, Juncheng
    Song, Hongye
    Lin, Jun
    Liu, Xiaozhong
    Tang, Siliang
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS - EMNLP 2023, 2023, : 1059 - 1077
  • [24] Knowledge-Aware Meta-learning for Low-Resource Text Classification
    Yao, Huaxiu
    Wu, Yingxin
    Al-Shedivat, Maruan
    Xing, Eric P.
    2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 1814 - 1821
  • [25] Robust Speech Recognition using Meta-learning for Low-resource Accents
    Eledath, Dhanya
    Baby, Arun
    Singh, Shatrughan
    2024 NATIONAL CONFERENCE ON COMMUNICATIONS, NCC, 2024
  • [26] A Lightweight Task-Agreement Meta Learning for Low-Resource Speech Recognition
    Chen, Yaqi
    Zhang, Hao
    Zhang, Wenlin
    Qu, Dan
    Yang, Xukui
    NEURAL PROCESSING LETTERS, 2024, 56 (04)
  • [27] State Value Generation with Prompt Learning and Self-Training for Low-Resource Dialogue State Tracking
    Gu, Ming
    Yang, Yan
    Chen, Chengcai
    Yu, Zhou
    ASIAN CONFERENCE ON MACHINE LEARNING, VOL 222, 2023, 222
  • [28] MetaXL: Meta Representation Transformation for Low-resource Cross-lingual Learning
    Xia, Mengzhou
    Zheng, Guoqing
    Mukherjee, Subhabrata
    Shokouhi, Milad
    Neubig, Graham
    Awadallah, Ahmed Hassan
    2021 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL-HLT 2021), 2021, : 499 - 511
  • [29] Navigating Linguistic Diversity: In-Context Learning and Prompt Engineering for Subjectivity Analysis in Low-Resource Languages
    Dwivedi, S.
    Ghosh, S.
    Dwivedi, S.
    SN COMPUTER SCIENCE, 5 (4)
  • [30] Prompt Tuning on Graph-Augmented Low-Resource Text Classification
    Wen, Zhihao
    Fang, Yuan
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2024, 36 (12) : 9080 - 9095