PTEKC: pre-training with event knowledge of ConceptNet for cross-lingual event causality identification

Authors
Zhu, Enchang [1 ,2 ]
Yu, Zhengtao [1 ,2 ]
Huang, Yuxin [1 ,2 ]
Gao, Shengxiang [1 ,2 ]
Xian, Yantuan [1 ,2 ]
Affiliations
[1] Kunming Univ Sci & Technol, Fac Informat Engn & Automat, Kunming 650500, Yunnan, Peoples R China
[2] Yunnan Key Lab Artificial Intelligence, Kunming 650500, Yunnan, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Event causality identification; Event knowledge; Multilingual pre-trained language models; Parameter-sharing adapter; Pre-training; GRAPH;
DOI
10.1007/s13042-024-02367-1
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Event causality identification (ECI) aims to identify causal relations between events in texts. Although existing ECI approaches based on fine-tuning pre-trained language models (PLMs) have achieved promising results, they suffer from prohibitive computation costs, catastrophic forgetting of distributional knowledge, and poor interpretability. In low-resource and cross-lingual scenarios in particular, existing multilingual models are generally confronted with the so-called curse of multilinguality and with language bias, which result in low accuracy and weak generalization. In this paper, we propose a paradigm, termed Pre-training with Event Knowledge of ConceptNet (PTEKC), that couples multilingual pre-trained language models (mPLMs) with event knowledge for cross-lingual event causality identification. Specifically, we develop a parameter-sharing adapter plugin that integrates event knowledge into the frozen PLMs. This approach significantly reduces the number of trainable parameters and greatly lowers the risk of catastrophic forgetting. Our adapter integrates multilingual aligned event knowledge into the mPLMs through two designed pre-training tasks, namely event masking and self-supervised link prediction. Extensive experiments on the benchmark dataset MECI show that PTEKC is parameter-efficient and can effectively incorporate multilingual aligned event knowledge to improve cross-lingual event causality identification.
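The abstract does not specify the adapter's internals, so the following is only a generic illustration of the bottleneck-adapter idea it builds on: a small trainable module adds a low-rank correction to frozen transformer hidden states, leaving the backbone's parameters untouched. All names and dimensions here are hypothetical, not taken from the paper.

```python
import numpy as np

def adapter_forward(h, W_down, W_up):
    """Bottleneck adapter: down-project, ReLU, up-project, residual add.

    h       : (tokens, d_model) hidden states from a frozen mPLM layer
    W_down  : (d_model, d_bottleneck) trainable down-projection
    W_up    : (d_bottleneck, d_model) trainable up-projection
    Only W_down and W_up would be updated during knowledge pre-training;
    the frozen hidden states pass through unchanged via the residual.
    """
    z = np.maximum(h @ W_down, 0.0)  # down-projection + ReLU nonlinearity
    return h + z @ W_up              # up-projection + residual connection

rng = np.random.default_rng(0)
d_model, d_bottleneck = 8, 2                      # hypothetical sizes
h = rng.standard_normal((3, d_model))             # 3 token states
W_down = rng.standard_normal((d_model, d_bottleneck)) * 0.01
W_up = rng.standard_normal((d_bottleneck, d_model)) * 0.01
out = adapter_forward(h, W_down, W_up)
```

Because the bottleneck is near-zero-initialized, the adapter starts as an identity-like map, which is one common way such plugins avoid disrupting the frozen model at the start of training.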
Pages: 1859-1872
Number of pages: 14