Multi-Granularity Semantic Aware Graph Model for Reducing Position Bias in Emotion-Cause Pair Extraction

Cited by: 0
Authors:
Bao, Yinan [1 ,2 ]
Mao, Qianwen [1 ,2 ]
Wei, Lingwei [1 ,2 ]
Zhou, Wei [1 ]
Hu, Songlin [1 ,2 ]
Affiliations:
[1] Chinese Acad Sci, Inst Informat Engn, Beijing, Peoples R China
[2] Univ Chinese Acad Sci, Sch Cyber Secur, Beijing, Peoples R China
Keywords: none listed
DOI: not available
Chinese Library Classification (CLC): TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
The Emotion-Cause Pair Extraction (ECPE) task aims to extract emotions and their corresponding causes as pairs from documents. We observe that the distribution of relative distances between emotions and causes is extremely imbalanced in the typical ECPE dataset. Existing methods set a fixed-size window to capture relations between neighboring clauses, but they neglect effective semantic connections between distant clauses, leading to poor generalization on position-insensitive data. To alleviate this problem, we propose a novel Multi-Granularity Semantic Aware Graph model (MGSAG) that jointly incorporates fine-grained and coarse-grained semantic features without any distance limitation. Specifically, we first explore the semantic dependencies between clauses and keywords extracted from the document, which convey fine-grained semantic features, and obtain keyword-enhanced clause representations. In addition, a clause graph is established to model coarse-grained semantic relations between clauses. Experimental results indicate that MGSAG surpasses existing state-of-the-art ECPE models, and that it outperforms other models by a significant margin on position-insensitive data.
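The abstract describes the architecture only at a high level. The sketch below is a minimal, hypothetical rendering of the two-granularity idea it mentions (keyword-enhanced clause representations plus a window-free clause graph), not the authors' implementation: the module names, dimensions, and the use of scaled dot-product attention over a fully connected clause graph are assumptions made purely for illustration.

```python
# Illustrative sketch only, assuming PyTorch; not the MGSAG release code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FineGrainedKeywordLayer(nn.Module):
    """Enhance clause representations with keyword semantics (fine granularity)."""
    def __init__(self, dim):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)

    def forward(self, clauses, keywords):
        # clauses: (num_clauses, dim); keywords: (num_keywords, dim)
        scores = self.q(clauses) @ self.k(keywords).t() / clauses.size(-1) ** 0.5
        attn = torch.softmax(scores, dim=-1)
        # Residual connection keeps the original clause content.
        return clauses + attn @ self.v(keywords)

class CoarseGrainedClauseGraph(nn.Module):
    """Fully connected clause graph (coarse granularity), no distance window."""
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim)

    def forward(self, clauses):
        # Attention over ALL clause pairs, so distant clauses can interact.
        scores = clauses @ clauses.t() / clauses.size(-1) ** 0.5
        return F.relu(self.proj(torch.softmax(scores, dim=-1) @ clauses))

if __name__ == "__main__":
    dim, n_clauses, n_keywords = 64, 10, 5
    clauses = torch.randn(n_clauses, dim)     # toy clause embeddings
    keywords = torch.randn(n_keywords, dim)   # toy keyword embeddings
    fine = FineGrainedKeywordLayer(dim)
    coarse = CoarseGrainedClauseGraph(dim)
    enhanced = coarse(fine(clauses, keywords))
    print(enhanced.shape)  # torch.Size([10, 64])
```

The point the sketch makes explicit is that the coarse-grained layer attends over every clause pair, so no fixed-size window ever excludes a distant emotion-cause candidate.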
Pages: 1203-1213
Page count: 11
Related papers (49 in total; items [31]-[40] listed below)
  • [31] Multi-granularity bidirectional attention stream machine comprehension method for emotion cause extraction
    Diao, Yufeng; Lin, Hongfei; Yang, Liang; Fan, Xiaochao; Chu, Yonghe; Wu, Di; Xu, Kan; Xu, Bo. Neural Computing and Applications, 2020, 32(12): 8401-8413
  • [33] A machine reading comprehension model with counterfactual contrastive learning for emotion-cause pair extraction
    Mai, Hanjie; Zhang, Xuejie; Wang, Jin; Zhou, Xiaobing. Knowledge and Information Systems, 2024, 66(6): 3459-3476
  • [34] A Mutually Auxiliary Multitask Model With Self-Distillation for Emotion-Cause Pair Extraction
    Yu, Jiaxin; Liu, Wenyuan; He, Yongjun; Zhang, Chunyue. IEEE Access, 2021, 9: 26811-26821
  • [36] FW-ECPE: An Emotion-Cause Pair Extraction Model Based on Fusion Word Vectors
    Song, Xinyi; Zou, Dongsheng; Yu, Yi; Zhang, Xiaotong. 2023 International Joint Conference on Neural Networks (IJCNN), 2023
  • [37] A Unified Target-Oriented Sequence-to-Sequence Model for Emotion-Cause Pair Extraction
    Cheng, Zifeng; Jiang, Zhiwei; Yin, Yafeng; Li, Na; Gu, Qing. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2021, 29: 2779-2791
  • [38] Multi-Task Sequence Tagging for Emotion-Cause Pair Extraction Via Tag Distribution Refinement
    Fan, Chuang; Yuan, Chaofa; Gui, Lin; Zhang, Yue; Xu, Ruifeng. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2021, 29: 2339-2350
  • [39] An end-to-end multi-task learning to link framework for emotion-cause pair extraction
    Song, Haolin; Song, Dawei. 2021 International Conference on Image, Video Processing, and Artificial Intelligence, 2021, 12076
  • [40] Emotion-Cause Pair Extraction via Transformer-Based Interaction Model with Text Capsule Network
    Yang, Cheng; Ding, Jie. Natural Language Processing and Chinese Computing (NLPCC 2022), Part I, 2022, 13551: 781-793