Neural Relation Extraction with Selective Attention over Instances

Cited: 0
Authors
Lin, Yankai [1 ]
Shen, Shiqi [1 ]
Liu, Zhiyuan [1 ,2 ]
Luan, Huanbo [1 ]
Sun, Maosong [1 ,2 ]
Affiliations
[1] Tsinghua Univ, Dept Comp Sci & Technol, State Key Lab Intelligent Technol & Syst, Natl Lab Informat Sci & Technol, Beijing, Peoples R China
[2] Jiangsu Collaborat Innovat Ctr Language Competenc, Nanjing, Jiangsu, Peoples R China
Source
PROCEEDINGS OF THE 54TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 1 | 2016
Funding
National Natural Science Foundation of China
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Distant supervised relation extraction has been widely used to find novel relational facts from text. However, distant supervision inevitably suffers from the wrong-labelling problem, and the resulting noisy data substantially hurt the performance of relation extraction. To alleviate this issue, we propose a sentence-level attention-based model for relation extraction. In this model, we employ convolutional neural networks to embed the semantics of sentences. We then build sentence-level attention over multiple instances, which is expected to dynamically reduce the weights of noisy instances. Experimental results on real-world datasets show that our model can make full use of all informative sentences and effectively reduce the influence of wrongly labelled instances. Our model achieves significant and consistent improvements on relation extraction compared with baselines. The source code of this paper can be obtained from https://github.com/thunlp/NRE.
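The selective-attention idea in the abstract can be sketched briefly: each sentence in an entity-pair bag is encoded as a vector (by a CNN in the paper), and an attention distribution over the bag downweights likely-mislabelled sentences before they are pooled into a single bag representation. Below is a minimal NumPy sketch under assumed conventions; the bilinear scoring matrix `A`, the relation query vector, and all names are illustrative, not the authors' released code.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def selective_attention(sentence_embs, relation_query, A):
    """Pool a bag of sentence embeddings with attention toward the relation.

    sentence_embs : (n, d) embeddings of the n sentences in one bag
    relation_query: (d,)   query vector for the candidate relation
    A             : (d, d) learned bilinear attention matrix (assumed form)
    Returns the (d,) bag representation and the (n,) attention weights.
    """
    scores = sentence_embs @ A @ relation_query  # e_i = x_i A r
    alpha = softmax(scores)                      # normalise over the bag
    bag = alpha @ sentence_embs                  # weighted sum of sentences
    return bag, alpha

# Toy bag: 3 sentences with 4-dimensional embeddings.
rng = np.random.default_rng(0)
embs = rng.normal(size=(3, 4))
query = rng.normal(size=4)
A = np.eye(4)  # identity as a placeholder for the learned matrix

bag, alpha = selective_attention(embs, query, A)
```

Noisy sentences whose embeddings score poorly against the relation query receive small `alpha` weights, so they contribute little to the bag vector used for classification.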
Pages: 2124-2133
Page count: 10
Related Papers
50 results
  • [31] A survey on neural relation extraction
    Liu Kang
    SCIENCE CHINA-TECHNOLOGICAL SCIENCES, 2020, 63 (10) : 1971 - 1989
  • [32] Neural relation extraction: a review
    Aydar, Mehmet
    Bozal, Ozge
    Ozbay, Furkan
    TURKISH JOURNAL OF ELECTRICAL ENGINEERING AND COMPUTER SCIENCES, 2021, 29 (02) : 1029 - 1043
  • [34] Recurrent neural networks with segment attention and entity description for relation extraction from clinical texts
    Li, Zhi
    Yang, Jinshan
    Gou, Xu
    Qi, Xiaorong
    ARTIFICIAL INTELLIGENCE IN MEDICINE, 2019, 97 : 9 - 18
  • [35] Neural Mechanisms of Selective Visual Attention
    Moore, Tirin
    Zirnsak, Marc
    ANNUAL REVIEW OF PSYCHOLOGY, VOL 68, 2017, 68 : 47 - 72
  • [36] NEURAL MECHANISMS OF VISUAL SELECTIVE ATTENTION
    MANGUN, GR
    PSYCHOPHYSIOLOGY, 1995, 32 (01) : 4 - 18
  • [37] Neural basis of visual selective attention
    Chelazzi, Leonardo
    Della Libera, Chiara
    Sani, Ilaria
    Santandrea, Elisa
    WILEY INTERDISCIPLINARY REVIEWS-COGNITIVE SCIENCE, 2011, 2 (04) : 392 - 407
  • [38] Self-Attention Over Tree for Relation Extraction With Data-Efficiency and Computational Efficiency
    Lyu, Shengfei
    Zhou, Xiren
    Wu, Xingyu
    Chen, Qiuju
    Chen, Huanhuan
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE, 2024, 8 (02): : 1253 - 1263
  • [39] Improving Relation Extraction with Knowledge-attention
    Li, Pengfei
    Mao, Kezhi
    Yang, Xuefeng
    Li, Qi
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 229 - 239
  • [40] Attention as Relation: Learning Supervised Multi-head Self-Attention for Relation Extraction
    Liu, Jie
    Chen, Shaowei
    Wang, Bingquan
    Zhang, Jiaxin
    Li, Na
    Xu, Tong
    PROCEEDINGS OF THE TWENTY-NINTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, : 3787 - 3793