Experiencer-Driven and Knowledge-Aware Graph Model for emotion-cause pair extraction

Cited by: 5
|
Authors
Li, Min [1 ,2 ,3 ]
Zhao, Hui [1 ,2 ,3 ]
Gu, Tiquan [1 ]
Ying, Di [1 ]
Affiliations
[1] Xinjiang Univ, Coll Informat Sci & Engn, Urumqi, Xinjiang, Peoples R China
[2] Key Lab Signal Detect & Proc Xinjiang Uygur Autono, Urumqi, Peoples R China
[3] Key Lab Multilingual Informat Technol Xinjiang Uyg, Urumqi, Xinjiang, Peoples R China
Keywords
Emotion-cause pair extraction; Experiencer identification; ATOMIC; Causality commonsense; Heterogeneous graph;
DOI
10.1016/j.knosys.2023.110703
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Numerous approaches have been explored to learn the relations between emotion and cause clauses, a key step in emotion-cause pair extraction. Despite their effectiveness, previous studies have two limitations: (1) they ignore that the emotion experiencer is an important clue implying causality between clauses, and (2) they ignore that causal commonsense, as prior knowledge, can enhance semantic associations between clauses. In this paper, we propose a novel Experiencer-Driven and Knowledge-Aware Graph Model (EDKA-GM). To address the first limitation, we introduce an experiencer identification task and present a document-level heterogeneous graph network that captures global experiencer information to enrich experiencer-based cross-clause associations. To address the second, we retrieve causal commonsense from the ATOMIC knowledge base for each clause and establish a knowledge-aware graph network that further strengthens inter-clause relationships by modeling a fully connected graph of clauses with commonsense knowledge. To the best of our knowledge, we are the first to explore the influence of the experiencer on emotion-cause pair extraction. On a benchmark dataset, our approach outperforms competitive baselines, achieving new state-of-the-art performance. © 2023 Published by Elsevier B.V.
Pages: 11