Feature-Level Attention Based Sentence Encoding for Neural Relation Extraction

Cited by: 5
Authors
Dai, Longqi [1 ]
Xu, Bo [1 ]
Song, Hui [1 ]
Affiliations
[1] Donghua Univ, Sch Comp Sci & Technol, Shanghai, Peoples R China
Keywords
Relation extraction; Feature-level attention; Attention strategies
DOI
10.1007/978-3-030-32233-5_15
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Relation extraction is an important NLP task for knowledge graph construction and question answering. Traditional relation extraction models simply concatenate all the features as the neural network input, ignoring the different contributions that individual features make to the semantic representation of entity relations. In this paper, we propose a feature-level attention model for sentence encoding, which aims to reveal the different effects of features on relation prediction. In our experiments, we systematically study three attention strategies and find that scaled dot-product attention outperforms the others. Experiments on a real-world dataset demonstrate that the proposed model achieves significant and consistent improvements on the relation extraction task compared with baselines.
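The scaled dot-product attention that the abstract identifies as the best-performing strategy can be sketched as follows. This is a minimal illustration, not the paper's implementation: the feature matrix, its dimensions, and the self-attention setup (features attending over each other, e.g. word, position, POS, and entity-type embeddings) are assumptions for demonstration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # pairwise feature similarities
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V, weights

# Toy example: 4 hypothetical feature embeddings of dimension 8,
# attending over one another (self-attention at the feature level).
rng = np.random.default_rng(0)
feats = rng.standard_normal((4, 8))
out, w = scaled_dot_product_attention(feats, feats, feats)
print(out.shape, w.shape)
```

The `1/sqrt(d_k)` scaling keeps the dot products from growing with the feature dimension, which would otherwise push the softmax into a near-one-hot regime with vanishing gradients.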
Pages: 184-196
Page count: 13
Related Papers
50 records
  • [1] A feature-level attention-based deep neural network model for sentence embedding
    Bouraoui A.
    Jamoussi S.
    Hamadou A.B.
    International Journal of Intelligent Systems Technologies and Applications, 2022, 20 (05) : 414 - 435
  • [2] Distant Supervised Relation Extraction Based on Sentence-Level Attention with Relation Alignment
    Li, Jing
    Huang, Xingjie
    Gao, Yating
    Liu, Jianyi
    Zhang, Ru
    Zhao, Jinmeng
    ARTIFICIAL INTELLIGENCE AND SECURITY, ICAIS 2022, PT I, 2022, 13338 : 142 - 152
  • [3] Unsupervised Relation Extraction Using Sentence Encoding
    Ali, Manzoor
    Saleem, Muhammad
    Ngomo, Axel-Cyrille Ngonga
    SEMANTIC WEB: ESWC 2021 SATELLITE EVENTS, 2021, 12739 : 136 - 140
  • [4] Feature-Level Attentive Neural Model for Session-Based Recommendation
    Yang, Qing
    Luo, Peicheng
    Cheng, Xinghe
    Li, Ning
    Zhang, Jingwei
    IEEE ACCESS, 2020, 8 : 132582 - 132591
  • [5] Distant Supervision for Relation Extraction with Sentence-Level Attention and Entity Descriptions
    Ji, Guoliang
    Liu, Kang
    He, Shizhu
    Zhao, Jun
    THIRTY-FIRST AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 3060 - 3066
  • [6] Opinion Mining Based on Feature-Level
    Liu, Lizhen
    Lv, Zhixin
    Wang, Hanshi
    2012 5TH INTERNATIONAL CONGRESS ON IMAGE AND SIGNAL PROCESSING (CISP), 2012, : 1596 - 1600
  • [7] Opinion Mining Feature-Level Using Naive Bayes And Feature Extraction Based Analysis Dependencies
    Sanda, Regi
    Baizal, Z. K. Abdurahinan
    Nhita, Fhira
    1ST INTERNATIONAL CONFERENCE ON ACTUARIAL SCIENCE AND STATISTICS (ICASS 2014), 2015, 1692
  • [8] Method for multi-band image feature-level fusion based on the attention mechanism
    Yang, Xiaoli
    Lin, Suzhen
    Xi'an Dianzi Keji Daxue Xuebao/Journal of Xidian University, 2020, 47 (01): : 120 - 127
  • [9] Probing Linguistic Features of Sentence-Level Representations in Neural Relation Extraction
    Alt, Christoph
    Gabryszak, Aleksandra
    Hennig, Leonhard
    58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020, : 1534 - 1545
  • [10] Combined CNN LSTM with attention for speech emotion recognition based on feature-level fusion
    Liu Y.
    Chen A.
    Zhou G.
    Yi J.
    Xiang J.
    Wang Y.
    Multimedia Tools and Applications, 2024, 83 (21) : 59839 - 59859