Enhanced ProtoNet With Self-Knowledge Distillation for Few-Shot Learning

Cited by: 0
Authors
Habib, Mohamed El Hacen [1 ]
Kucukmanisa, Ayhan [1 ]
Urhan, Oguzhan [1 ]
Affiliations
[1] Kocaeli Univ, Dept Elect & Telecommun Engn, TR-41001 Kocaeli, Türkiye
Source
IEEE ACCESS | 2024, Vol. 12
Keywords
Prototypes; Computational modeling; Training; Predictive models; Few shot learning; Adaptation models; Data models; Feature extraction; Training data; Metalearning; Few-shot learning; meta-learning; prototypical networks; self-knowledge distillation
DOI
10.1109/ACCESS.2024.3472530
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Few-Shot Learning (FSL) has recently gained increased attention for its effectiveness in addressing the problem of data scarcity. Many approaches based on the FSL idea have been proposed, including Prototypical Networks (ProtoNet), which remain effective against this problem while keeping the architecture simple. Meanwhile, the self-knowledge distillation (SKD) technique has become a popular way to help FSL models achieve good performance by transferring knowledge gained from additional training data. In this work, we apply self-knowledge distillation to ProtoNet to boost its performance. For each task, we compute prototypes from the few examples (local prototypes) and from the many examples (global prototypes), and use the global prototypes to distill knowledge into the few-shot learner. We employ distillation variants based on prototypes, logits, and predictions (soft labels). We evaluated our method on three popular FSL image classification benchmark datasets: CIFAR-FS, FC100, and miniImageNet. Our approach outperformed the baseline and achieved results competitive with state-of-the-art methods, especially on the FC100 dataset.
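The mechanism the abstract describes (local prototypes from the K-shot support set, global prototypes from a larger per-class pool, and distillation from the global predictions into the few-shot learner) can be illustrated with a short PyTorch sketch. This is a minimal sketch under stated assumptions, not the authors' implementation: the encoder, the extra per-class pool (extra_x, extra_y), and the hyperparameter names tau and lambda_kd are all illustrative, and only the soft-label (prediction-level) distillation variant is shown.

import torch
import torch.nn.functional as F

def prototypes(features, labels, n_classes):
    # Class-mean embeddings, as in ProtoNet.
    return torch.stack([features[labels == c].mean(dim=0) for c in range(n_classes)])

def episode_loss(encoder, support_x, support_y, extra_x, extra_y,
                 query_x, query_y, n_classes, tau=4.0, lambda_kd=0.5):
    # Embed the K-shot support set, a larger per-class pool, and the queries.
    z_support = encoder(support_x)
    z_pool = encoder(torch.cat([support_x, extra_x]))
    y_pool = torch.cat([support_y, extra_y])
    z_query = encoder(query_x)

    local_p = prototypes(z_support, support_y, n_classes)   # few-example prototypes
    global_p = prototypes(z_pool, y_pool, n_classes)        # many-example prototypes

    # Negative squared Euclidean distance as class logits (ProtoNet scoring).
    logits_local = -torch.cdist(z_query, local_p) ** 2
    logits_global = -torch.cdist(z_query, global_p) ** 2

    # Standard few-shot classification loss on the local prototypes.
    ce = F.cross_entropy(logits_local, query_y)

    # Soft-label self-distillation: global-prototype predictions act as the
    # teacher for the few-shot learner (temperature-scaled KL divergence).
    kd = F.kl_div(F.log_softmax(logits_local / tau, dim=1),
                  F.softmax(logits_global.detach() / tau, dim=1),
                  reduction="batchmean") * tau ** 2
    return ce + lambda_kd * kd

The prototype- and logit-level variants mentioned in the abstract would replace the kd term, e.g. with a distance penalty between local_p and global_p; the exact formulations are given in the paper itself.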
Pages: 145331-145340
Page count: 10
Related Papers (showing 10 of 50)
  • [1] Prototype-wise self-knowledge distillation for few-shot segmentation. Chen, Yadang; Xu, Xinyu; Wei, Chenchen; Lu, Chuhan. SIGNAL PROCESSING-IMAGE COMMUNICATION, 2024, 129.
  • [2] Few-shot image classification with improved similarity relationships in self-knowledge distillation. Li, Liang; Jin, Weidong; Ren, Junxiao; Huang, Yingkun; Yan, Kang. 2022 41ST CHINESE CONTROL CONFERENCE (CCC), 2022: 7053-7058.
  • [3] Enhancing the Generalization Performance of Few-Shot Image Classification with Self-Knowledge Distillation. Li, Liang; Jin, Weidong; Huang, Yingkun; Ren, Junxiao. STUDIES IN INFORMATICS AND CONTROL, 2022, 31 (02): 71-80.
  • [4] Efficient-PrototypicalNet with self knowledge distillation for few-shot learning. Lim, Jit Yan; Lim, Kian Ming; Ooi, Shih Yin; Lee, Chin Poo. NEUROCOMPUTING, 2021, 459: 327-337.
  • [5] Hierarchical Knowledge Propagation and Distillation for Few-Shot Learning. Zhou, Chunpeng; Wang, Haishuai; Zhou, Sheng; Yu, Zhi; Bandara, Danushka; Bu, Jiajun. NEURAL NETWORKS, 2023, 167: 615-625.
  • [6] Few-shot Learning with Online Self-Distillation. Liu, Sihan; Wang, Yue. 2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION WORKSHOPS (ICCVW 2021), 2021: 1067-1070.
  • [7] SSL-ProtoNet: Self-supervised Learning Prototypical Networks for few-shot learning. Lim, Jit Yan; Lim, Kian Ming; Lee, Chin Poo; Tan, Yong Xuan. EXPERT SYSTEMS WITH APPLICATIONS, 2024, 238.
  • [8] Contrastive knowledge-augmented self-distillation approach for few-shot learning. Zhang, Lixu; Shao, Mingwen; Chen, Sijie; Liu, Fukang. JOURNAL OF ELECTRONIC IMAGING, 2023, 32 (05).
  • [9] Knowledge Distillation Meets Few-Shot Learning: An Approach for Few-Shot Intent Classification Within and Across Domains. Sauer, Anna; Asaadi, Shima; Kuech, Fabian. PROCEEDINGS OF THE 4TH WORKSHOP ON NLP FOR CONVERSATIONAL AI, 2022: 108-119.
  • [10] Integrating Knowledge Distillation With Learning to Rank for Few-Shot Scene Classification. Liu, Yishu; Zhang, Liqiang; Han, Zhengzhuo; Chen, Conghui. IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2022, 60.