Neural Relation Extraction with Selective Attention over Instances

Cited: 0
Authors
Lin, Yankai [1 ]
Shen, Shiqi [1 ]
Liu, Zhiyuan [1 ,2 ]
Luan, Huanbo [1 ]
Sun, Maosong [1 ,2 ]
Affiliations
[1] Tsinghua Univ, Dept Comp Sci & Technol, State Key Lab Intelligent Technol & Syst, Natl Lab Informat Sci & Technol, Beijing, Peoples R China
[2] Jiangsu Collaborat Innovat Ctr Language Competenc, Nanjing, Jiangsu, Peoples R China
Source
PROCEEDINGS OF THE 54TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 1 | 2016
Funding
National Natural Science Foundation of China;
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Distant supervised relation extraction has been widely used to find novel relational facts from text. However, distant supervision inevitably suffers from the wrong-labelling problem, and the resulting noisy data can substantially hurt the performance of relation extraction. To alleviate this issue, we propose a sentence-level attention-based model for relation extraction. In this model, we employ convolutional neural networks to embed the semantics of sentences. We then build sentence-level attention over multiple instances, which is expected to dynamically reduce the weights of noisy instances. Experimental results on real-world datasets show that our model can make full use of all informative sentences and effectively reduce the influence of wrongly labelled instances. Our model achieves significant and consistent improvements on relation extraction compared with baselines. The source code of this paper can be obtained from https://github.com/thunlp/NRE.
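The selective-attention step described in the abstract can be sketched roughly as follows. Each sentence in a bag (all sentences mentioning an entity pair) is first encoded by a CNN into a fixed-size embedding; the bag representation is then a softmax-weighted sum of those embeddings, scored against a relation query vector via a bilinear form. This is a minimal NumPy sketch under that reading of the paper; the function and variable names are illustrative, not the authors' code:

```python
import numpy as np

def selective_attention(sentence_embs, relation_query, A):
    """Attention-pool a bag of sentence embeddings against a relation query.

    sentence_embs  : (n, d) array, one CNN-encoded embedding per sentence
    relation_query : (d,) query vector for the candidate relation
    A              : (d, d) bilinear scoring matrix (diagonal in the paper)
    Returns the (d,) bag representation and the (n,) attention weights.
    """
    # Score each sentence: e_i = s_i^T A r -- how well it expresses the relation.
    scores = sentence_embs @ A @ relation_query
    # Softmax over the bag; noisy (wrongly labelled) sentences get small weights.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Bag vector is the attention-weighted sum of sentence embeddings.
    bag = weights @ sentence_embs
    return bag, weights
```

Because the weights are a softmax over in-bag scores, a sentence that matches the relation query dominates the bag representation, while mismatched (likely mislabelled) sentences are down-weighted rather than discarded outright.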
Pages: 2124 - 2133
Page count: 10
Related Papers
50 records in total
  • [1] Neural Relation Classification Using Selective Attention and Symmetrical Directional Instances
    Tan, Zhen
    Li, Bo
    Huang, Peixin
    Ge, Bin
    Xiao, Weidong
    SYMMETRY-BASEL, 2018, 10 (09):
  • [2] Self-selective attention using correlation between instances for distant supervision relation extraction
    Zhou, Yanru
    Pan, Limin
    Bai, Chongyou
    Luo, Senlin
    Wu, Zhouting
    NEURAL NETWORKS, 2021, 142 : 213 - 220
  • [3] Lifelong learning with selective attention over seen classes and memorized instances
    Wang, Zhijun
    Wang, Hongxing
    NEURAL COMPUTING & APPLICATIONS, 2024, 36 (15): 8473 - 8484
  • [5] Graph neural networks with selective attention and path reasoning for document-level relation extraction
    Hang, Tingting
    Feng, Jun
    Wang, Yunfeng
    Yan, Le
    APPLIED INTELLIGENCE, 2024, 54 (07) : 5353 - 5372
  • [6] Distant supervision for relation extraction with hierarchical selective attention
    Zhou, Peng
    Xu, Jiaming
    Qi, Zhenyu
    Bao, Hongyun
    Chen, Zhineng
    Xu, Bo
    NEURAL NETWORKS, 2018, 108 : 240 - 247
  • [7] Beyond Word Attention: Using Segment Attention in Neural Relation Extraction
    Yu, Bowen
    Zhang, Zhenyu
    Liu, Tingwen
    Wang, Bin
    Li, Sujian
    Li, Quangang
    PROCEEDINGS OF THE TWENTY-EIGHTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2019, : 5401 - 5407
  • [8] Neural Relation Extraction with Multi-lingual Attention
    Lin, Yankai
    Liu, Zhiyuan
    Sun, Maosong
    PROCEEDINGS OF THE 55TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2017), VOL 1, 2017, : 34 - 43
  • [9] Relation Instances Creation Algorithm for Relation Extraction System
    Zayats, Antonina
    Zayats, Mariya
    PROCEEDINGS OF XIIITH INTERNATIONAL CONFERENCE - EXPERIENCE OF DESIGNING AND APPLICATION OF CAD SYSTEMS IN MICROELECTRONICS CADSM 2015, 2015, : 478 - 480
  • [10] Distant supervised relation extraction with position feature attention and selective bag attention
    Wang, Jiasheng
    Liu, Qiongxin
    NEUROCOMPUTING, 2021, 461 : 552 - 561