Feature-Level Attention Based Sentence Encoding for Neural Relation Extraction

Cited by: 5
Authors
Dai, Longqi [1]
Xu, Bo [1]
Song, Hui [1]
Affiliations
[1] Donghua Univ, Sch Comp Sci & Technol, Shanghai, Peoples R China
Keywords
Relation extraction; Feature-level attention; Attention strategies
DOI
10.1007/978-3-030-32233-5_15
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104 ; 0812 ; 0835 ; 1405
Abstract
Relation extraction is an important NLP task for knowledge graph construction and question answering. Traditional relation extraction models simply concatenate all the features as the input of a neural network, ignoring the different contributions of the features to the semantic representation of entity relations. In this paper, we propose a feature-level attention model for sentence encoding, which tries to reveal the different effects of features on relation prediction. In the experiments, we systematically study three attention strategies and show that scaled dot-product attention outperforms the others. Experiments on a real-world dataset demonstrate that the proposed model achieves significant and consistent improvements over baselines on the relation extraction task.
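The record gives no implementation details, but the feature-level attention with scaled dot-product scoring described in the abstract can be sketched roughly as follows. This is a minimal, hypothetical PyTorch sketch, assuming the per-token features are a word embedding plus two entity-position embeddings (a common setup in neural relation extraction); the class name, dimensions, and the learned query vector are illustrative and are not the authors' code.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class FeatureLevelAttention(nn.Module):
        # Scores each per-token feature embedding against a learned query
        # with scaled dot-product attention and reweights the features
        # before they are combined into the token representation.
        def __init__(self, vocab_size, max_pos, dim=50):
            super().__init__()
            self.word_emb = nn.Embedding(vocab_size, dim)   # word feature
            self.pos1_emb = nn.Embedding(max_pos, dim)      # offset to head entity
            self.pos2_emb = nn.Embedding(max_pos, dim)      # offset to tail entity
            self.query = nn.Parameter(torch.randn(dim))     # learned feature query

        def forward(self, words, pos1, pos2):
            # stack the three feature embeddings: (batch, seq_len, 3, dim)
            feats = torch.stack(
                [self.word_emb(words), self.pos1_emb(pos1), self.pos2_emb(pos2)],
                dim=2)
            # scaled dot-product score per feature: (batch, seq_len, 3)
            scores = feats.matmul(self.query) / feats.size(-1) ** 0.5
            alpha = F.softmax(scores, dim=-1)
            # reweight each feature and flatten for a downstream sentence encoder
            return (feats * alpha.unsqueeze(-1)).flatten(2)  # (batch, seq_len, 3*dim)

    # usage with toy ids: two sentences of 30 tokens each
    enc = FeatureLevelAttention(vocab_size=20000, max_pos=200, dim=50)
    out = enc(torch.randint(0, 20000, (2, 30)),
              torch.randint(0, 200, (2, 30)),
              torch.randint(0, 200, (2, 30)))
    print(out.shape)  # torch.Size([2, 30, 150])

The reweighted token representations would then feed a CNN or RNN sentence encoder, replacing the plain concatenation of features that the abstract argues against.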
Pages: 184-196
Number of pages: 13
Related Papers
50 records in total (items 41-50 shown)
  • [41] Phrase-level Self-Attention Networks for Universal Sentence Encoding
    Wu, Wei
    Wang, Houfeng
    Liu, Tianyu
    Ma, Shuming
    2018 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2018), 2018, : 3729 - 3738
  • [42] Feature selection based on word-sentence relation
    Schönhofen, P
    Benczúr, AA
    ICMLA 2005: FOURTH INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS, PROCEEDINGS, 2005, : 37 - 42
  • [43] Color Component Feature Selection in Feature-Level Fusion Based Color Face Recognition
    Lee, Seung Ho
    Choi, Jae Young
    Plataniotis, Konstantinos N.
    Ro, Yong Man
    2010 IEEE INTERNATIONAL CONFERENCE ON FUZZY SYSTEMS (FUZZ-IEEE 2010), 2010,
  • [44] Novel OGBEE-based feature selection and feature-level fusion with MLP neural network for social media multimodal sentiment analysis
    Bairavel, S.
    Krishnamurthy, M.
    SOFT COMPUTING, 2020, 24 (24) : 18431 - 18445
  • [45] Chinese Relation Extraction Based on Cross-Attention and Multi-feature Perception
    Xu, Shiao
    Sun, Shuihua
    Zhang, Zhiyuan
    Zhou, Huan
    Journal of Network Intelligence, 2024, 9 (03): 1837 - 1853
  • [46] Distant supervised relation extraction with position feature attention and selective bag attention
    Wang, Jiasheng
    Liu, Qiongxin
    NEUROCOMPUTING, 2021, 461 : 552 - 561
  • [47] Improved convolutional neural network chiller early fault diagnosis by gradient-based feature-level model interpretation and feature learning
    Li, Guannan
    Chen, Liang
    Fan, Cheng
    Gao, Jiajia
    Xu, Chengliang
    Fang, Xi
    APPLIED THERMAL ENGINEERING, 2024, 236
  • [48] Novel OGBEE-based feature selection and feature-level fusion with MLP neural network for social media multimodal sentiment analysis
    S. Bairavel
    M. Krishnamurthy
    Soft Computing, 2020, 24 : 18431 - 18445
  • [49] Beyond Word Attention: Using Segment Attention in Neural Relation Extraction
    Yu, Bowen
    Zhang, Zhenyu
    Liu, Tingwen
    Wang, Bin
    Li, Sujian
    Li, Quangang
    PROCEEDINGS OF THE TWENTY-EIGHTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2019, : 5401 - 5407
  • [50] Document-Level Relation Extraction Method Based on Attention Semantic Enhancement
    Liu X.
    Wu W.
    Zhao W.
    Hou W.
    Tongji Daxue Xuebao/Journal of Tongji University, 2024, 52 (05): : 822 - 828