Phrase-level attention network for few-shot inverse relation classification in knowledge graph

Cited by: 0
Authors
Wu, Shaojuan [1 ]
Dou, Chunliu [2 ]
Wang, Dazhuang [1 ]
Li, Jitong [1 ]
Zhang, Xiaowang [1 ]
Feng, Zhiyong [1 ]
Wang, Kewen [3 ]
Yitagesu, Sofonias [4 ]
Affiliations
[1] Tianjin Univ, Tianjin, Peoples R China
[2] CNPC Econ & Technol Res Inst, Beijing, Peoples R China
[3] Griffith Univ, Brisbane, Australia
[4] Debre Berhan Univ, Debre Berhan, Ethiopia
Funding
National Natural Science Foundation of China
Keywords
Knowledge graph; Few-shot relation classification; Inverse relation; Function-words;
DOI
10.1007/s11280-023-01142-6
CLC classification
TP [Automation Technology, Computer Technology]
Discipline code
0812
Abstract
Relation classification aims to recognize the semantic relation between two entities mentioned in a given text. Existing models perform well on inverse relation classification with large-scale datasets, but their performance drops significantly in few-shot settings. In this paper, we propose a Phrase-level Attention Network, a function-words adaptively enhanced attention framework (FAEA+), which uses a designed hybrid attention mechanism to attend to class-related function words for few-shot inverse relation classification in knowledge graphs. An instance-aware prototype network is then presented to adaptively capture relation information associated with query instances and to eliminate the intra-class redundancy introduced by function words. We theoretically prove that introducing function words increases intra-class differences and that the designed instance-aware prototype network reduces this redundancy. Experimental results show that FAEA+ significantly improves over strong baselines on two few-shot relation classification datasets. Moreover, our model has a distinct advantage in handling inverse relations, outperforming state-of-the-art results by 16.82% under the 1-shot setting on FewRel 1.0.
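The instance-aware prototype network described above builds on the standard prototypical-network backbone for few-shot classification. As an illustrative sketch only (not the authors' code; FAEA+'s hybrid attention over function words and instance-aware prototype adaptation are omitted), each relation class is represented by the mean of its support-instance embeddings, and a query is assigned to the nearest prototype:

```python
# Minimal prototypical-network sketch for few-shot relation classification.
# Assumed, simplified setting: embeddings are plain float vectors; real
# systems would produce them with a sentence encoder such as BERT.
from typing import Dict, List


def euclidean_sq(a: List[float], b: List[float]) -> float:
    """Squared Euclidean distance between two embedding vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))


def prototypes(support: Dict[str, List[List[float]]]) -> Dict[str, List[float]]:
    """Class prototype = mean of the class's support embeddings."""
    return {
        rel: [sum(dim) / len(vecs) for dim in zip(*vecs)]
        for rel, vecs in support.items()
    }


def classify(query: List[float], protos: Dict[str, List[float]]) -> str:
    """Assign the query to the relation with the nearest prototype."""
    return min(protos, key=lambda rel: euclidean_sq(query, protos[rel]))


# Toy 2-way 2-shot episode with 2-dimensional embeddings
support = {
    "capital_of": [[1.0, 0.0], [0.9, 0.1]],
    "born_in": [[0.0, 1.0], [0.1, 0.9]],
}
protos = prototypes(support)
print(classify([0.8, 0.2], protos))  # → capital_of
```

FAEA+ departs from this baseline in two ways the sketch does not show: function words are re-weighted by hybrid attention before encoding, and prototypes are adapted per query instance to remove the intra-class redundancy that those function words introduce.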
Pages: 3001-3026
Page count: 26
Source: World Wide Web, 2023, Vol. 26