SAPENet: Self-Attention based Prototype Enhancement Network for Few-shot Learning

Cited by: 35
Authors
Huang, Xilang [1 ]
Choi, Seon Han [2 ,3 ]
Institutions
[1] Pukyong Natl Univ, Dept Artificial Intelligent Convergence, Pusan 48513, South Korea
[2] Ewha Womans Univ, Dept Elect & Elect Engn, Seoul 03760, South Korea
[3] Ewha Womans Univ, Grad Program Smart Factory, Seoul 03760, South Korea
Funding
National Research Foundation of Singapore;
Keywords
Few-shot learning; Multi-head self-attention mechanism; Image classification; k-Nearest neighbor
DOI
10.1016/j.patcog.2022.109170
CLC classification
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Few-shot learning considers the problem of learning unseen categories given only a few labeled samples. As one of the most popular few-shot learning approaches, Prototypical Networks have received considerable attention owing to their simplicity and efficiency. However, a class prototype is typically obtained by averaging a few labeled samples belonging to the same class, which treats the samples as equally important and is thus prone to learning redundant features. Herein, we propose a self-attention based prototype enhancement network (SAPENet) to obtain a more representative prototype for each class. SAPENet utilizes multi-head self-attention mechanisms to selectively augment discriminative features in each sample feature map, and generates channel attention maps between intra-class sample features to attentively retain informative channel features for that class. The augmented feature maps and attention maps are finally fused to obtain representative class prototypes. Thereafter, a local descriptor-based metric module is employed to fully exploit the channel information of the prototypes by searching the k most similar local descriptors of the prototype for each local descriptor in the unlabeled samples for classification. We performed experiments on multiple benchmark datasets: miniImageNet, tieredImageNet, and CUB-200-2011. The experimental results on these datasets show that SAPENet achieves a considerable improvement compared to Prototypical Networks and also outperforms related state-of-the-art methods. © 2022 Elsevier Ltd. All rights reserved.
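To make the pipeline the abstract describes concrete, here is a minimal PyTorch sketch of the two stages: self-attention enhancement of support features into class prototypes, and the local-descriptor k-NN scoring of a query. The names PrototypeEnhancer and knn_local_score, the softmax-over-shots channel weighting, and all tensor shapes are illustrative assumptions, not the authors' released implementation; the paper's exact attention and fusion layers differ in detail.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class PrototypeEnhancer(nn.Module):
    """Illustrative SAPENet-style prototype enhancement (not the authors' code)."""

    def __init__(self, channels: int, num_heads: int = 4):
        super().__init__()
        # channels must be divisible by num_heads
        self.mhsa = nn.MultiheadAttention(channels, num_heads, batch_first=True)

    def forward(self, support: torch.Tensor) -> torch.Tensor:
        # support: (n_way, k_shot, C, H, W) backbone feature maps
        n, k, c, h, w = support.shape
        # treat the H*W spatial positions as a sequence of C-dim local descriptors
        x = support.view(n * k, c, h * w).transpose(1, 2)      # (n*k, HW, C)
        x, _ = self.mhsa(x, x, x)                              # augment discriminative descriptors
        x = x.transpose(1, 2).reshape(n, k, c, h, w)
        # ASSUMED fusion: channel attention across the k intra-class samples,
        # realized here as a softmax over each channel's pooled response per shot
        gap = x.mean(dim=(3, 4))                               # (n, k, C)
        weights = torch.softmax(gap, dim=1)[..., None, None]   # per-channel weights over shots
        return (weights * x).sum(dim=1)                        # prototypes: (n, C, H, W)


def knn_local_score(query: torch.Tensor, protos: torch.Tensor, k: int = 3) -> torch.Tensor:
    """Image-to-class score: for each query local descriptor, sum the cosine
    similarities of its k nearest descriptors in each class prototype."""
    qd = F.normalize(query.flatten(1).t(), dim=1)              # (HW, C) query descriptors
    scores = []
    for p in protos:                                           # p: (C, H, W)
        pd = F.normalize(p.flatten(1).t(), dim=1)              # (HW, C) prototype descriptors
        sim = qd @ pd.t()                                      # pairwise cosine similarity
        scores.append(sim.topk(k, dim=1).values.sum())         # keep k best matches per descriptor
    return torch.stack(scores)                                 # (n_way,), higher = closer


# toy 5-way 1-shot episode with 64-channel 5x5 feature maps
enhancer = PrototypeEnhancer(channels=64, num_heads=4)
protos = enhancer(torch.randn(5, 1, 64, 5, 5))
logits = knn_local_score(torch.randn(64, 5, 5), protos)
print("predicted class:", logits.argmax().item())
```

Treating each spatial position as a local descriptor is what lets the k-NN metric exploit per-channel prototype information that plain prototype averaging would wash out.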
Pages: 11