A comprehensive exploration of semantic relation extraction via pre-trained CNNs

Cited by: 35
Authors
Li, Qing [1 ]
Li, Lili [2 ]
Wang, Weinan [3 ]
Li, Qi [4 ]
Zhong, Jiang [1 ,5 ]
Affiliations
[1] Chongqing Univ, Coll Comp Sci, Chongqing, Peoples R China
[2] Chongqing Univ, Sch Civil Engn, Chongqing, Peoples R China
[3] Peking Univ, Sch Math Sci, Beijing, Peoples R China
[4] Shaoxing Univ, Dept Comp Sci & Engn, Shaoxing, Peoples R China
[5] Chongqing Univ, Key Lab Dependable Serv Comp Cyber Phys Soc, Chongqing, Peoples R China
Keywords
Relation extraction; Semantic relation; Natural language processing; Convolutional neural networks;
DOI
10.1016/j.knosys.2020.105488
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Semantic relation extraction between entity pairs is a crucial task in information extraction from text. In this paper, we propose a new pre-trained network architecture for this task, called XM-CNN. XM-CNN utilizes word-embedding and position-embedding information and is designed to reinforce the contextual output of the MT-DNNKD pre-trained model. Our model effectively uses an entity-aware attention mechanism to detect features, and also applies relation-specific pooling attention. The experimental results show that XM-CNN achieves state-of-the-art results on SemEval-2010 Task 8, and a thorough evaluation of the method is conducted. (C) 2020 Elsevier B.V. All rights reserved.
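The mechanism described in the abstract — word embeddings concatenated with position embeddings, convolved over token windows, then pooled with relation-specific attention instead of plain max-pooling — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation; all dimensions, variable names, and the use of random weights are assumptions for demonstration only.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)

# Hypothetical sizes (not taken from the paper).
seq_len, d_word, d_pos, n_filters, win = 10, 8, 4, 6, 3

# Per-token input: word embedding plus two position embeddings
# (relative distance to each entity), concatenated.
words = rng.normal(size=(seq_len, d_word))
pos_e1 = rng.normal(size=(seq_len, d_pos))
pos_e2 = rng.normal(size=(seq_len, d_pos))
x = np.concatenate([words, pos_e1, pos_e2], axis=1)   # (seq_len, d_in)

# 1-D convolution over token windows with tanh activation.
d_in = x.shape[1]
W = rng.normal(size=(n_filters, win * d_in))
pad = np.pad(x, ((win // 2, win // 2), (0, 0)))
conv = np.stack([np.tanh(W @ pad[i:i + win].ravel())
                 for i in range(seq_len)])            # (seq_len, n_filters)

# Relation-specific attention pooling: one learned query per relation
# scores every position; a weighted sum replaces max-pooling, so each
# relation attends to the positions most indicative of it.
n_relations = 3
queries = rng.normal(size=(n_relations, n_filters))
scores = softmax(conv @ queries.T, axis=0)            # (seq_len, n_relations)
pooled = scores.T @ conv                              # (n_relations, n_filters)
print(pooled.shape)  # (3, 6)
```

The pooled features (one vector per candidate relation) would then feed a classifier; in the full model the contextual output of a pre-trained encoder would replace the random word embeddings used here.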
Pages: 7
Related Papers
50 records in total
  • [21] Few-shot medical relation extraction via prompt tuning enhanced pre-trained language model
    He, Guoxiu
    Huang, Chen
    NEUROCOMPUTING, 2025, 633
  • [22] Protocol for the automatic extraction of epidemiological information via a pre-trained language model
    Wang, Zhizheng
    Liu, Xiao Fan
    Du, Zhanwei
    Wang, Lin
    Wu, Ye
    Holme, Petter
    Lachmann, Michael
    Lin, Hongfei
    Wang, Zhuoyue
    Cao, Yu
    Wong, Zoie S. Y.
    Xu, Xiao-Ke
    Sun, Yuanyuan
    STAR PROTOCOLS, 2023, 4 (03):
  • [23] Biomedical event extraction using pre-trained SciBERT
    Mulya, Dimmas
    Khodra, Masayu Leylia
    JOURNAL OF INTELLIGENT SYSTEMS, 2023, 32 (01)
  • [24] MaskDiffusion: Exploiting Pre-Trained Diffusion Models for Semantic Segmentation
    Kawano, Yasufumi
    Aoki, Yoshimitsu
    IEEE ACCESS, 2024, 12 : 127283 - 127293
  • [25] Unlocking Pre-trained Image Backbones for Semantic Image Synthesis
    Berrada, Tariq
    Verbeek, Jakob
    Couprie, Camille
    Alahari, Karteek
    2024 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2024, 2024, : 7840 - 7849
  • [26] Fine-tuning Pre-Trained Transformer Language Models to Distantly Supervised Relation Extraction
    Alt, Christoph
    Huebner, Marc
    Hennig, Leonhard
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 1388 - 1398
  • [27] Enhancing Skin Diseases Classification Through Dual Ensemble Learning and Pre-trained CNNs
    El Gannour, Oussama
    Hamida, Soufiane
    Lamalem, Yasser
    Cherradi, Bouchaib
    Saleh, Shawki
    Raihani, Abdelhadi
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2023, 14 (06) : 436 - 445
  • [28] Efficient fusion of handcrafted and pre-trained CNNs features to classify melanoma skin cancer
    Filali, Youssef
    EL Khoukhi, Hasnae
    Sabri, My Abdelouahed
    Aarab, Abdellah
    MULTIMEDIA TOOLS AND APPLICATIONS, 2020, 79 (41-42) : 31219 - 31238
  • [30] Tuning Pre-trained Model via Moment Probing
    Gao, Mingze
    Wang, Qilong
    Lin, Zhenyi
    Zhu, Pengfei
    Hu, Qinghua
    Zhou, Jingbo
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2023), 2023, : 11769 - 11779