Enhance prototypical networks with hybrid attention and confusing loss function for few-shot relation classification

Cited: 10
Authors
Li, Yibing [1 ,2 ,3 ]
Ma, Zuchang [1 ]
Gao, Lisheng [1 ]
Wu, Yichen [1 ,2 ,4 ]
Xie, Fei [3 ]
Ren, Xiaoye [3 ]
Affiliations
[1] Chinese Acad Sci, Hefei Inst Phys Sci, Inst Intelligent Machines, Anhui Prov Key Lab Med Phys & Technol, Hefei 230031, Peoples R China
[2] Univ Sci & Technol China, Sci Isl Branch Grad Sch, Hefei 230026, Peoples R China
[3] Hefei Normal Univ, Sch Comp Sci & Technol, Hefei 230601, Peoples R China
[4] Anhui Jianzhu Univ, Sch Elect & Informat Engn, Hefei 230601, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Relation classification; Few-shot learning; Hybrid attention; Loss; BERT;
DOI
10.1016/j.neucom.2022.04.067
CLC Number
TP18 [Theory of artificial intelligence];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Relation classification (RC) is a fundamental task for building knowledge graphs and formalizing semantics. It aims to classify the relation between the head and tail entities in a sentence. Existing RC methods mainly adopt the distant supervision (DS) scheme; however, DS suffers from the long-tail problem and data sparsity. Recently, few-shot learning (FSL) has attracted considerable attention, as it addresses the long-tail problem by learning from only a few samples. Prototypical networks perform well on FSL, classifying a relation by the distance between a query and the class prototypes. However, prototypical networks and their variants do not consider the critical role of entity words. In addition, not all sentences in the support set contribute equally to classifying relations. Furthermore, an entity pair in a sentence may exhibit both a true relation and confusing relations, which are difficult for an RC model to distinguish. To address these problems, a new context encoder, BERT_FE, is proposed; it uses the pre-trained BERT model and fuses the information of the head and tail entities through entity word-level attention (WLA). At the same time, sentence-level attention (SLA) is proposed to give more weight to support-set sentences that are similar to the query instance, improving classification accuracy. A confusing loss function (CLF) is designed to enhance the model's ability to distinguish between true and confusing relations. Experimental results demonstrate that the proposed model (HACLF) outperforms several baseline models. (c) 2022 Elsevier B.V. All rights reserved.
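The abstract does not give the exact formulations of WLA, SLA, or CLF, so the following is only a minimal PyTorch-style sketch of the two ideas it does describe: weighting support-set sentences by their similarity to the query before forming class prototypes, and adding a margin-style penalty that separates the true relation from its most confusing competitor. All function names, the dot-product attention, and the hinge term below are illustrative assumptions, not the paper's actual method.

import torch
import torch.nn.functional as F

def sentence_level_attention(support, query):
    # support: [N, K, D] embeddings of the K support sentences for each of N relations
    # query:   [Q, D]    embeddings of the Q query instances
    # Weight every support sentence by its (dot-product) similarity to the query,
    # then build one query-specific prototype per relation.
    scores = torch.einsum('qd,nkd->qnk', query, support)        # [Q, N, K]
    weights = F.softmax(scores, dim=-1)                          # [Q, N, K]
    return torch.einsum('qnk,nkd->qnd', weights, support)       # [Q, N, D]

def proto_logits(prototypes, query):
    # Negative squared Euclidean distance to each prototype serves as the logit.
    diff = prototypes - query.unsqueeze(1)                       # [Q, N, D]
    return -(diff ** 2).sum(dim=-1)                              # [Q, N]

def confusing_margin_loss(logits, labels, margin=1.0):
    # Cross-entropy plus an illustrative "confusing relation" term: the score of
    # the strongest wrong relation must stay at least `margin` below the true one.
    ce = F.cross_entropy(logits, labels)
    true_score = logits.gather(1, labels.unsqueeze(1)).squeeze(1)
    masked = logits.clone()
    masked.scatter_(1, labels.unsqueeze(1), float('-inf'))
    confusing_score = masked.max(dim=1).values
    hinge = F.relu(margin - (true_score - confusing_score)).mean()
    return ce + hinge

if __name__ == '__main__':
    N, K, Q, D = 5, 5, 3, 768          # 5-way 5-shot episode, 3 queries, BERT-sized vectors
    support = torch.randn(N, K, D)     # stand-in for BERT_FE sentence embeddings
    query = torch.randn(Q, D)
    labels = torch.randint(0, N, (Q,))
    logits = proto_logits(sentence_level_attention(support, query), query)
    print(confusing_margin_loss(logits, labels))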
Pages: 362-372
Page count: 11
Related Papers
50 records in total
  • [41] Survey of Few-Shot Relation Classification
    Liu, Tao
    Ke, Zunwang
    Wushour
    Computer Engineering and Applications, 2023, 59 (09) : 1 - 2
  • [42] Behavior regularized prototypical networks for semi-supervised few-shot image classification
    Huang, Shixin
    Zeng, Xiangping
    Wu, Si
    Yu, Zhiwen
    Azzam, Mohamed
    Wong, Hau-San
    PATTERN RECOGNITION, 2021, 112
  • [43] Center Loss Guided Prototypical Networks for Unbalance Few-Shot Industrial Fault Diagnosis
    Yu, Tong
    Guo, Haobin
    Zhu, Yiyi
    MOBILE INFORMATION SYSTEMS, 2022, 2022
  • [44] Dual Prototypical Network for Robust Few-shot Image Classification
    Song, Qi
    Peng, Zebin
    Ji, Luchen
    Yang, Xiaochen
    Li, Xiaoxu
    PROCEEDINGS OF 2022 ASIA-PACIFIC SIGNAL AND INFORMATION PROCESSING ASSOCIATION ANNUAL SUMMIT AND CONFERENCE (APSIPA ASC), 2022, : 533 - 537
  • [45] Bidirectional Matching Prototypical Network for Few-Shot Image Classification
    Fu, Wen
    Zhou, Li
    Chen, Jie
    IEEE SIGNAL PROCESSING LETTERS, 2022, 29 : 982 - 986
  • [46] Global Prototypical Network for Few-Shot Hyperspectral Image Classification
    Zhang, Chengye
    Yue, Jun
    Qin, Qiming
    IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2020, 13 (13) : 4748 - 4759
  • [47] Bimodal semantic fusion prototypical network for few-shot classification
    Huang, Xilang
    Choi, Seon Han
    INFORMATION FUSION, 2024, 109
  • [48] Hybrid attentive prototypical network for few-shot action recognition
    Ruan, Zanxi
    Wei, Yingmei
    Guo, Yanming
    Xie, Yuxiang
    COMPLEX & INTELLIGENT SYSTEMS, 2024, 10 (06) : 8249 - 8272
  • [49] Cross Attention Network for Few-shot Classification
    Hou, Ruibing
    Chang, Hong
    Ma, Bingpeng
    Shan, Shiguang
    Chen, Xilin
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [50] Multiscale attention for few-shot image classification
    Zhou, Tong
    Dong, Changyin
    Song, Junshu
    Zhang, Zhiqiang
    Wang, Zhen
    Chang, Bo
    Chen, Dechun
    COMPUTATIONAL INTELLIGENCE, 2024, 40 (02)