Beyond Word Attention: Using Segment Attention in Neural Relation Extraction

Cited: 0
Authors
Yu, Bowen [1 ,2 ]
Zhang, Zhenyu [1 ,2 ]
Liu, Tingwen [1 ]
Wang, Bin [3 ]
Li, Sujian [4 ]
Li, Quangang [1 ]
Affiliations
[1] Chinese Acad Sci, Inst Informat Engn, Beijing, Peoples R China
[2] Univ Chinese Acad Sci, Sch Cyber Secur, Beijing, Peoples R China
[3] Xiaomi Inc, Xiaomi AI Lab, Beijing, Peoples R China
[4] Peking Univ, Key Lab Computat Linguist, MOE, Beijing, Peoples R China
Source
PROCEEDINGS OF THE TWENTY-EIGHTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE | 2019
Keywords
DOI
Not available
CLC number (Chinese Library Classification)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Relation extraction studies the problem of predicting semantic relations between pairs of entities in sentences. Attention mechanisms are often used in this task to alleviate inner-sentence noise by performing soft selections of words independently. Based on the observation that information pertinent to relations is usually contained within segments (runs of consecutive words in a sentence), this phenomenon can be exploited for better extraction. In this paper, we aim to incorporate such segment information into a neural relation extractor. Our approach views the attention mechanism as a linear-chain conditional random field over a set of latent variables whose edges encode the desired structure, and regards the attention weight as the marginal probability of each word being selected as part of the relational expression. Experimental results show that our method can attend to contiguous relational expressions without explicit annotations, and achieves state-of-the-art performance on the large-scale TACRED dataset.
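The core idea in the abstract, treating attention weights as marginals of a linear-chain CRF over binary "selected" variables, can be illustrated with a toy forward-backward computation. The sketch below is hypothetical and not the authors' implementation: the function name `segment_attention`, the binary state space, and the shapes of the unary/transition log-potentials are all assumptions made for illustration. The point it demonstrates is that the marginal P(z_t = 1) for each word falls out of a standard forward-backward pass, and a transition matrix favoring state persistence biases the selection toward contiguous segments.

```python
import numpy as np

def logsumexp(a, axis):
    """Numerically stable log-sum-exp along the given axis."""
    m = np.max(a, axis=axis, keepdims=True)
    return np.squeeze(m, axis=axis) + np.log(np.sum(np.exp(a - m), axis=axis))

def segment_attention(unary, trans):
    """Attention weights as CRF marginals (illustrative sketch).

    unary: (T, 2) log-potentials; column 0 = word not selected, column 1 = selected.
    trans: (2, 2) log transition potentials; trans[i, j] scores z_{t-1}=i -> z_t=j.
    Returns: length-T array of marginal probabilities P(z_t = 1), usable as
    attention weights over the sentence.
    """
    T = unary.shape[0]
    # Forward pass: alpha[t, s] = log sum over prefixes ending in state s at t.
    alpha = np.zeros((T, 2))
    alpha[0] = unary[0]
    for t in range(1, T):
        alpha[t] = unary[t] + logsumexp(alpha[t - 1][:, None] + trans, axis=0)
    # Backward pass: beta[t, s] = log sum over suffixes starting in state s at t.
    beta = np.zeros((T, 2))
    for t in range(T - 2, -1, -1):
        beta[t] = logsumexp(trans + (unary[t + 1] + beta[t + 1])[None, :], axis=1)
    log_Z = logsumexp(alpha[-1], axis=0)  # log partition function
    log_marginals = alpha + beta - log_Z
    return np.exp(log_marginals[:, 1])
```

With all-zero potentials every word has marginal 0.5; raising the "selected" unary score of one word pushes its marginal toward 1, and a diagonal-heavy `trans` would additionally pull neighboring words along, which is what encourages segment-level (rather than word-independent) selection.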
Pages: 5401 - 5407
Page count: 7
Related papers
50 records in total
  • [31] An Unsupervised Neural Attention Model for Aspect Extraction
    He, Ruidan
    Lee, Wee Sun
    Ng, Hwee Tou
    Dahlmeier, Daniel
    PROCEEDINGS OF THE 55TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2017), VOL 1, 2017, : 388 - 397
  • [32] WAGRank: A word ranking model based on word attention graph for keyphrase extraction
    Bian, Rong
    Cheng, Bing
    INTELLIGENT DATA ANALYSIS, 2024
  • [33] Improved Distant Supervised Model in Tibetan Relation Extraction Using ELMo and Attention
    Sun, Yuan
    Wang, Like
    Chen, Chaofan
    Xia, Tianci
    Zhao, Xiaobing
    IEEE ACCESS, 2019, 7 : 173054 - 173062
  • [34] Semantic relation extraction using sequential and tree-structured LSTM with attention
    Geng, ZhiQiang
    Chen, GuoFei
    Han, YongMing
    Lu, Gang
    Li, Fang
    INFORMATION SCIENCES, 2020, 509 : 183 - 192
  • [35] Specific Relation Attention-Guided Graph Neural Networks for Joint Entity and Relation Extraction in Chinese EMR
    Pang, Yali
    Qin, Xiaohui
    Zhang, Zhichang
    APPLIED SCIENCES-BASEL, 2022, 12 (17):
  • [36] The Relation Between Attention, Inhibition and Word Learning in Young Children
    Sia, Ming Yean
    Holmboe, Karla
    Mani, Nivedita
    COLLABRA-PSYCHOLOGY, 2023, 9 (01)
  • [37] Distant supervision for relation extraction with hierarchical selective attention
    Zhou, Peng
    Xu, Jiaming
    Qi, Zhenyu
    Bao, Hongyun
    Chen, Zhineng
    Xu, Bo
    NEURAL NETWORKS, 2018, 108 : 240 - 247
  • [38] Attention Guided Graph Convolutional Networks for Relation Extraction
    Guo, Zhijiang
    Zhang, Yan
    Lu, Wei
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 241 - 251
  • [39] Temporal Relation Extraction with Joint Semantic and Syntactic Attention
    Jin, Panpan
    Li, Feng
    Li, Xiaoyu
    Liu, Qing
    Liu, Kang
    Ma, Haowei
    Dong, Pengcheng
    Tang, Shulin
    COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2022, 2022
  • [40] Attention Weight is Indispensable in Joint Entity and Relation Extraction
    Ouyang, Jianquan
    Zhang, Jing
    Liu, Tianming
    INTELLIGENT AUTOMATION AND SOFT COMPUTING, 2022, 34 (03): : 1707 - 1723