Entity Relation Joint Extraction Method Based on Insertion Transformers

Cited by: 0
Authors
Qi, Haotian [1 ]
Liu, Weiguang [1 ]
Liu, Fenghua [1 ]
Zhu, Weigang [1 ]
Shan, Fangfang [1 ]
Affiliations
[1] Zhongyuan Univ Technol, Coll Comp, Zhengzhou 451191, Henan, Peoples R China
Keywords
Entity relation extraction; tagging strategy; joint extraction; transformer;
DOI
10.14569/IJACSA.2024.0150467
CLC Number
TP301 [Theory, Methods];
Discipline Code
081202;
Abstract
Existing multi-module multi-step and multi-module single-step methods for joint entity relation extraction suffer from issues such as cascading errors and redundant predictions. The single-module single-step modeling approach largely alleviates these limitations, but it still struggles with complex relation extraction tasks, producing excessive negative samples and long decoding times. To address these issues, this paper proposes a joint entity relation extraction method based on Insertion Transformers, which adopts the single-module single-step approach and integrates a newly proposed tagging strategy. The method iteratively identifies insertion positions and inserts tags into the text; by combining attention mechanisms with contextual information, it reduces both decoding time and the number of negative samples while also resolving the entity overlap problem. Compared with state-of-the-art models on two public datasets, the method achieves high F1 scores of 93.2% and 91.5%, respectively, demonstrating its effectiveness in resolving entity overlap.
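The abstract's core idea, iteratively predicting positions and inserting tags into the text until decoding converges, can be sketched as follows. This is an illustrative toy, not the paper's implementation: the predictor, the `<E1>`/`</E1>` tag names, and the fixed-point loop are all assumptions for demonstration; a real model would predict entity and relation tags jointly from transformer attention.

```python
# Illustrative sketch of iterative insertion decoding (assumption, not the
# paper's actual method): a predictor proposes (position, tag) insertions
# each pass, and tags are spliced in until no further insertions are made.

def insertion_decode(tokens, predict_insertions, max_iters=10):
    """Repeatedly ask `predict_insertions` for (index, tag) pairs and
    insert the tags into the sequence until a fixed point is reached."""
    seq = list(tokens)
    for _ in range(max_iters):
        proposals = predict_insertions(seq)
        if not proposals:
            break
        # Insert right-to-left so earlier indices remain valid.
        for idx, tag in sorted(proposals, reverse=True):
            seq.insert(idx, tag)
    return seq

# Hypothetical predictor that marks the entity span around "Paris";
# real models would score every candidate position with attention.
def toy_predictor(seq):
    if "<E1>" not in seq and "Paris" in seq:
        i = seq.index("Paris")
        return [(i, "<E1>"), (i + 1, "</E1>")]
    return []

print(insertion_decode("Paris is in France".split(), toy_predictor))
# → ['<E1>', 'Paris', '</E1>', 'is', 'in', 'France']
```

Because decoding stops as soon as the predictor proposes nothing, spans that never receive tags generate no further work, which is one way such a scheme can cut down on negative samples and decoding passes.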
Pages: 656-664
Page count: 9
Related Papers
50 records
  • [41] Joint Entity and Relation Extraction With Set Prediction Networks
    Sui, Dianbo
    Zeng, Xiangrong
    Chen, Yubo
    Liu, Kang
    Zhao, Jun
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (09) : 12784 - 12795
  • [42] Attention Weight is Indispensable in Joint Entity and Relation Extraction
    Ouyang, Jianquan
    Zhang, Jing
    Liu, Tianming
    INTELLIGENT AUTOMATION AND SOFT COMPUTING, 2022, 34 (03): : 1707 - 1723
  • [43] Joint model of entity recognition and relation extraction based on artificial neural network
    Zhang, Zhu
    Zhan, Shu
    Zhang, Haiyan
    Li, Xinke
    JOURNAL OF AMBIENT INTELLIGENCE AND HUMANIZED COMPUTING, 2020, 13 (7) : 3503 - 3511
  • [44] A novel entity joint annotation relation extraction model
    Xu, Meng
    Pi, Dechang
    Cao, Jianjun
    Yuan, Shuilian
    APPLIED INTELLIGENCE, 2022, 52 (11) : 12754 - 12770
  • [45] Joint Learning of Named Entity Recognition and Relation Extraction
    Xu, Qiuyan
    Li, Fang
    2011 INTERNATIONAL CONFERENCE ON COMPUTER SCIENCE AND NETWORK TECHNOLOGY (ICCSNT), VOLS 1-4, 2012, : 1978 - 1982
  • [46] Joint entity and relation extraction with table filling based on graph convolutional Networks
    Jia, Wei
    Ma, Ruizhe
    Yan, Li
    Niu, Weinan
    Ma, Zongmin
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 266
  • [47] A joint entity Relation Extraction method for document level Traditional Chinese Medicine texts
    Xu, Wenxuan
    Wang, Lin
    Zhang, Mingchuan
    Zhu, Junlong
    Yan, Junqiang
    Wu, Qingtao
    ARTIFICIAL INTELLIGENCE IN MEDICINE, 2024, 154
  • [48] A Relation-Specific Attention Network for Joint Entity and Relation Extraction
    Yuan, Yue
    Zhou, Xiaofei
    Pan, Shirui
    Zhu, Qiannan
    Song, Zeliang
    Guo, Li
    PROCEEDINGS OF THE TWENTY-NINTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, : 4054 - 4060
  • [49] GrantRel: Grant Information Extraction via Joint Entity and Relation Extraction
    Bian, Junyi
    Huang, Li
    Huang, Xiaodi
    Zhou, Hong
    Zhu, Shanfeng
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021, : 2674 - 2685
  • [50] Span-based joint entity and relation extraction augmented with sequence tagging mechanism
    Ji, Bin
    Li, Shasha
    Xu, Hao
    Yu, Jie
    Ma, Jun
    Liu, Huijun
    Yang, Jing
    SCIENCE CHINA-INFORMATION SCIENCES, 2024, 67 (05) : 84 - 98