Attention-Based Graph Convolutional Network for Zero-Shot Learning with Pre-Training

Cited by: 0
Authors
Wu, Xuefei [1 ]
Liu, Mingjiang [1 ]
Xin, Bo [1 ]
Zhu, Zhangqing [1 ]
Wang, Gang [2 ]
Affiliations
[1] Nanjing Univ, Sch Management & Engn, Dept Control & Syst Engn, Nanjing 210093, Peoples R China
[2] Minist Agr & Rural Area, Nanjing Res Inst Agr Mechanizat, Nanjing 210014, Peoples R China
Funding
National Natural Science Foundation of China;
Open Access
All Open Access; Gold;
DOI
10.1155/2021/7480712
CLC number
T [Industrial Technology];
Subject classification code
08;
Abstract
Zero-shot learning (ZSL) is a powerful and promising learning paradigm for classifying instances that have not been seen in training. Although graph convolutional networks (GCNs) have recently shown great potential for ZSL tasks, these models use constant connection weights between the nodes in the knowledge graph, so all neighbor nodes contribute equally to classifying the central node. In this study, we apply an attention mechanism that adjusts the connection weights adaptively, allowing the network to learn the information most relevant to classifying unseen target nodes. First, we propose an attention graph convolutional network for zero-shot learning (AGCNZ) by integrating the attention mechanism directly into the GCN. Then, to prevent the dilution of knowledge from distant nodes, we apply the dense graph propagation (DGP) model to the ZSL tasks and propose an attention dense graph propagation model for zero-shot learning (ADGPZ). Finally, we propose a modified loss function with a relaxation factor to further improve the performance of the learned classifier. Experimental results under different pre-training settings verify the effectiveness of the proposed attention-based models for ZSL.
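The abstract describes the approach only at a high level; the exact attention formulation, loss, and hyperparameters are not given here. As a minimal sketch, assuming a GAT-style pairwise scoring function for the adaptive connection weights and a margin-based "relaxation" term in the regression loss (the class AttentionGraphConv, the function relaxed_regression_loss, and the margin value are illustrative assumptions, not the authors' formulation), the two ideas could look like this:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionGraphConv(nn.Module):
    """One attention-weighted graph convolution layer (GAT-style scoring)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)  # shared node-feature projection
        self.a = nn.Linear(2 * out_dim, 1, bias=False)   # attention scoring on node pairs

    def forward(self, x, adj):
        # x: (N, in_dim) node features; adj: (N, N) 0/1 adjacency of the knowledge
        # graph, assumed to include self-loops so every row has at least one edge.
        h = self.W(x)                                            # (N, out_dim)
        N = h.size(0)
        pairs = torch.cat([h.unsqueeze(1).expand(N, N, -1),
                           h.unsqueeze(0).expand(N, N, -1)], dim=-1)
        e = F.leaky_relu(self.a(pairs)).squeeze(-1)              # raw scores, (N, N)
        e = e.masked_fill(adj == 0, float('-inf'))               # keep only graph edges
        alpha = torch.softmax(e, dim=-1)                         # adaptive connection weights
        return F.relu(alpha @ h)                                 # weighted neighbor aggregation

def relaxed_regression_loss(pred_w, true_w, margin=0.1):
    # Hypothetical relaxation factor: per-dimension errors smaller than `margin`
    # are ignored, so predicted classifier weights need not match the targets exactly.
    diff = (pred_w - true_w).abs()
    return F.relu(diff - margin).pow(2).mean()
```

In this sketch the softmax over masked scores replaces the fixed, normalized adjacency weights of a standard GCN, so different neighbors can contribute unequally to the central node's representation.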
Pages: 13
Related papers
50 records in total
  • [41] Zero-shot Key Information Extraction from Mixed-Style Tables: Pre-training on Wikipedia
    Yang, Qingping
    Hu, Yingpeng
    Cao, Rongyu
    Li, Hongwei
    Luo, Ping
    Proceedings - IEEE International Conference on Data Mining, ICDM, 2021, 2021-December : 1451 - 1456
  • [43] Semantic-Adversarial Graph Convolutional Network for Zero-Shot Cross-Modal Retrieval
    Li, Chuang
    Fei, Lunke
    Kang, Peipei
    Liang, Jiahao
    Fang, Xiaozhao
    Teng, Shaohua
    PRICAI 2022: TRENDS IN ARTIFICIAL INTELLIGENCE, PT II, 2022, 13630 : 459 - 472
  • [44] Implicit and explicit attention mechanisms for zero-shot learning
    Alamri, Faisal
    Dutta, Anjan
    NEUROCOMPUTING, 2023, 534 : 55 - 66
  • [45] Attributes learning network for generalized zero-shot learning
    Yun, Yu
    Wang, Sen
    Hou, Mingzhen
    Gao, Quanxue
    NEURAL NETWORKS, 2022, 150 : 112 - 118
  • [46] Attribute Attention for Semantic Disambiguation in Zero-Shot Learning
    Liu, Yang
    Guo, Jishun
    Cai, Deng
    He, Xiaofei
    2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2019), 2019, : 6697 - 6706
  • [47] Differential Refinement Network for Zero-Shot Learning
    Tian, Yi
    Zhang, Yilei
    Huang, Yaping
    Xu, Wanru
    Ding, Zhengming
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (03) : 4164 - 4178
  • [48] Attention-Based Multiple Graph Convolutional Recurrent Network for Traffic Forecasting
    Liu, Lu
    Cao, Yibo
    Dong, Yuhan
    SUSTAINABILITY, 2023, 15 (06)
  • [49] Hierarchical Zero-Shot Classification with Convolutional Neural Network Features and Semantic Attribute Learning
    Markowitz, Jared
    Schmidt, Aurora C.
    Burlina, Philippe M.
    Wang, I-Jeng
    PROCEEDINGS OF THE FIFTEENTH IAPR INTERNATIONAL CONFERENCE ON MACHINE VISION APPLICATIONS - MVA2017, 2017, : 194 - 197
  • [50] Dual Bidirectional Graph Convolutional Networks for Zero-shot Node Classification
    Yue, Qin
    Liang, Jiye
    Cui, Junbiao
    Bai, Liang
    PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022, : 2408 - 2417