Beyond Word Attention: Using Segment Attention in Neural Relation Extraction

Cited by: 0
Authors
Yu, Bowen [1 ,2 ]
Zhang, Zhenyu [1 ,2 ]
Liu, Tingwen [1 ]
Wang, Bin [3 ]
Li, Sujian [4 ]
Li, Quangang [1 ]
Affiliations
[1] Chinese Acad Sci, Inst Informat Engn, Beijing, Peoples R China
[2] Univ Chinese Acad Sci, Sch Cyber Secur, Beijing, Peoples R China
[3] Xiaomi Inc, Xiaomi AI Lab, Beijing, Peoples R China
[4] Peking Univ, Key Lab Computat Linguist, MOE, Beijing, Peoples R China
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Relation extraction studies the problem of predicting semantic relations between pairs of entities in sentences. Attention mechanisms are often used in this task to alleviate inner-sentence noise by performing soft selections of words independently. Based on the observation that the information pertinent to a relation is usually contained within segments (consecutive words in a sentence), this property can be exploited for better extraction. In this paper, we aim to incorporate such segment information into a neural relation extractor. Our approach views the attention mechanism as linear-chain conditional random fields over a set of latent variables whose edges encode the desired structure, and regards the attention weight of each word as the marginal probability of that word being selected as part of the relational expression. Experimental results show that our method can attend to continuous relational expressions without explicit annotations, and achieves state-of-the-art performance on the large-scale TACRED dataset.
Pages: 5401-5407
Number of pages: 7
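
The segment-attention idea described in the abstract can be pictured with a small sketch: each word carries a binary latent variable indicating whether it lies inside the relational segment, a linear-chain CRF scores the tag sequence, and the attention weight of a word is its marginal probability of being tagged "inside", computed with the forward-backward algorithm. The PyTorch listing below is a minimal illustration under these assumptions, not the authors' implementation; crf_marginals, the random emission/transition scores, and the toy dimensions are placeholders chosen for the example.

import torch

def crf_marginals(emissions, transitions):
    # emissions: (T, 2) unary scores for tags {0: outside, 1: inside}
    # transitions: (2, 2) scores, transitions[i, j] = score of moving from tag i to tag j
    # returns: (T,) tensor with p(z_t = 1 | x) for every position t
    T = emissions.size(0)
    alpha = torch.zeros(T, 2)   # forward log-scores
    beta = torch.zeros(T, 2)    # backward log-scores
    alpha[0] = emissions[0]
    for t in range(1, T):
        # alpha[t, j] = emissions[t, j] + logsumexp_i(alpha[t-1, i] + transitions[i, j])
        alpha[t] = emissions[t] + torch.logsumexp(
            alpha[t - 1].unsqueeze(1) + transitions, dim=0)
    for t in range(T - 2, -1, -1):
        # beta[t, i] = logsumexp_j(transitions[i, j] + emissions[t+1, j] + beta[t+1, j])
        beta[t] = torch.logsumexp(
            transitions + (emissions[t + 1] + beta[t + 1]).unsqueeze(0), dim=1)
    log_z = torch.logsumexp(alpha[-1], dim=0)      # log partition function
    log_marginals = alpha + beta - log_z           # log p(z_t = j | x)
    return log_marginals[:, 1].exp()               # attention weight per word

# Toy usage: attention-pooled sentence representation for a relation classifier.
T, d = 6, 8
hidden = torch.randn(T, d)        # stand-in for encoder (e.g. BiLSTM) outputs
emissions = torch.randn(T, 2)     # stand-in for a learned unary scoring layer
transitions = torch.randn(2, 2)   # stand-in for learned transition scores
attn = crf_marginals(emissions, transitions)                # soft segment attention weights
sentence_repr = (attn.unsqueeze(1) * hidden).sum(dim=0)     # (d,) pooled vector

In a full model the emission scores would come from a learned scoring layer over the encoder states, and the marginals would be trained end-to-end together with the relation classifier, so that contiguous spans tend to receive high weight without any explicit segment annotation.
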
Related Papers
50 records in total
  • [1] Distant supervision for neural relation extraction integrated with word attention and property features
    Qu, Jianfeng
    Ouyang, Dantong
    Hua, Wen
    Ye, Yuxin
    Li, Ximing
    NEURAL NETWORKS, 2018, 100 : 59 - 69
  • [2] Recurrent neural networks with segment attention and entity description for relation extraction from clinical texts
    Li, Zhi
    Yang, Jinshan
    Gou, Xu
    Qi, Xiaorong
    ARTIFICIAL INTELLIGENCE IN MEDICINE, 2019, 97 : 9 - 18
  • [3] Neural Relation Extraction with Multi-lingual Attention
    Lin, Yankai
    Liu, Zhiyuan
    Sun, Maosong
    PROCEEDINGS OF THE 55TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2017), VOL 1, 2017, : 34 - 43
  • [4] Neural Relation Extraction with Selective Attention over Instances
    Lin, Yankai
    Shen, Shiqi
    Liu, Zhiyuan
    Luan, Huanbo
    Sun, Maosong
    PROCEEDINGS OF THE 54TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 1, 2016, : 2124 - 2133
  • [5] Relation Extraction in Vietnamese Text via Piecewise Convolution Neural Network with Word-Level Attention
    Nguyen, Van-Nhat
    Nguyen, Ha-Thanh
    Vo, Dinh-Hieu
    Nguyen, Le-Minh
    PROCEEDINGS OF 2018 5TH NAFOSTED CONFERENCE ON INFORMATION AND COMPUTER SCIENCE (NICS 2018), 2018, : 99 - 103
  • [6] A Deep Attention Network for Chinese Word Segment
    Li, Lanxin
    Gong, Ping
    Ji, Likun
    ICMLC 2019: 2019 11TH INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND COMPUTING, 2019, : 528 - 532
  • [7] Traditional Chinese medicine entity relation extraction based on CNN with segment attention
    Bai, Tian
    Guan, Haotian
    Wang, Shang
    Wang, Ye
    Huang, Lan
    NEURAL COMPUTING & APPLICATIONS, 2022, 34 (04): 2739 - 2748
  • [8] Dual CNN for Relation Extraction with Knowledge-Based Attention and Word Embeddings
    Li, Jun
    Huang, Guimin
    Chen, Jianheng
    Wang, Yabing
    COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2019, 2019
  • [9] Piecewise convolutional neural networks with position attention and similar bag attention for distant supervision relation extraction
    Li, Weijiang
    Wang, Qing
    Wu, Jiekun
    Yu, Zhengtao
    APPLIED INTELLIGENCE, 2022, 52 (04) : 4599 - 4609