Label smoothing and task-adaptive loss function based on prototype network for few-shot learning

Cited by: 14
Authors
Gao, Farong [1]
Luo, Xingsheng [1]
Yang, Zhangyi [1]
Zhang, Qizhong [1]
Affiliations
[1] Hangzhou Dianzi Univ, Sch Automat, Hangzhou 310018, Peoples R China
Keywords
Flexible hyperparameters; Improved loss function; Few-shot learning; Image classification; Deep learning; Classification
DOI
10.1016/j.neunet.2022.09.018
CLC number
TP18 [Theory of artificial intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
To address two weaknesses of the prototype network, namely that label information is not sufficiently reliable and that the hyperparameters of the loss function cannot track changes in image feature information, we propose a method that combines label smoothing with task-adaptive hyperparameters. First, the label information of an image is processed by label smoothing regularization. Then, according to the classification task, the distance matrix of the image features and a logarithmic operation are used to fuse the distance matrix with the hyperparameters of the loss function. Finally, the hyperparameters are combined with the smoothed labels and the distance matrix for predictive classification. The method is validated on the miniImageNet, FC100 and tieredImageNet datasets. The results show that, compared with methods using unsmoothed labels and fixed hyperparameters, the flexible hyperparameters in the loss function improve few-shot classification accuracy by 2%-3%. This indicates that the proposed method suppresses the interference of unreliable labels, and that flexible hyperparameters improve classification accuracy. (c) 2022 Elsevier Ltd. All rights reserved.
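The pipeline described in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: `smooth_labels`, `prototype_logits`, and `smoothed_proto_loss` are hypothetical names, and the task-adaptive hyperparameter, which the paper derives from the distance matrix, is replaced here by a fixed temperature `tau`.

```python
import numpy as np

def smooth_labels(one_hot, eps=0.1):
    # Label smoothing regularization: put (1 - eps) on the true class
    # and spread eps uniformly over all K classes.
    k = one_hot.shape[-1]
    return one_hot * (1.0 - eps) + eps / k

def prototype_logits(queries, prototypes):
    # Prototypical-network logits: negative squared Euclidean distance
    # from each query embedding (N, D) to each class prototype (K, D).
    d2 = ((queries[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=-1)
    return -d2  # shape (N, K)

def log_softmax(z):
    # Numerically stable row-wise log-softmax.
    z = z - z.max(axis=1, keepdims=True)
    return z - np.log(np.exp(z).sum(axis=1, keepdims=True))

def smoothed_proto_loss(queries, prototypes, one_hot, eps=0.1, tau=1.0):
    # Cross-entropy between the smoothed labels and the distance-based
    # softmax. In the paper the scaling hyperparameter adapts to the
    # distance matrix of each task; here tau is simply fixed.
    targets = smooth_labels(one_hot, eps)
    log_p = log_softmax(prototype_logits(queries, prototypes) / tau)
    return float(-(targets * log_p).sum(axis=1).mean())
```

For example, with two well-separated prototypes, queries near prototype 0 are assigned class 0, and the loss under correct labels is lower than under swapped labels, with the smoothing term bounding how confident the target distribution can become.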
Pages: 39-48
Page count: 10
Related papers
50 records
  • [1] Label smoothing and task-adaptive loss function based on prototype network for few-shot learning
    Gao, Farong
    Luo, Xingsheng
    Yang, Zhangyi
    Zhang, Qizhong
    Neural Networks, 2022, 156: 39-48
  • [2] Meta-Learning with Task-Adaptive Loss Function for Few-Shot Learning
    Baik, Sungyong
    Choi, Janghoon
    Kim, Heewon
    Cho, Dohee
    Min, Jaesik
    Lee, Kyoung Mu
    2021 IEEE/CVF International Conference on Computer Vision (ICCV 2021), 2021: 9445-9454
  • [3] Task-adaptive Relation Dependent Network for Few-shot Learning
    He, Xi
    Li, Fanzhang
    Liu, Li
    2021 International Joint Conference on Neural Networks (IJCNN), 2021
  • [4] Task-adaptive Few-shot Learning on Sphere Manifold
    He, Xi
    Li, Fanzhang
    2022 26th International Conference on Pattern Recognition (ICPR), 2022: 2949-2956
  • [5] Learning to Learn Task-Adaptive Hyperparameters for Few-Shot Learning
    Baik, Sungyong
    Choi, Myungsub
    Choi, Janghoon
    Kim, Heewon
    Lee, Kyoung Mu
    IEEE Transactions on Pattern Analysis and Machine Intelligence, 2024, 46(3): 1441-1454
  • [6] TapNet: Neural Network Augmented with Task-Adaptive Projection for Few-Shot Learning
    Yoon, Sung Whan
    Seo, Jun
    Moon, Jaekyun
    International Conference on Machine Learning, 2019, 97
  • [7] Few-shot classification with task-adaptive semantic feature learning
    Pan, Mei-Hong
    Xin, Hong-Yi
    Xia, Chun-Qiu
    Shen, Hong-Bin
    Pattern Recognition, 2023, 141
  • [8] Task-Adaptive Few-shot Node Classification
    Wang, Song
    Ding, Kaize
    Zhang, Chuxu
    Chen, Chen
    Li, Jundong
    Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD 2022), 2022: 1910-1919
  • [9] Task-adaptive Label Dependency Transfer for Few-shot Named Entity Recognition
    Zhang, Shan
    Cao, Bin
    Zhang, Tianming
    Liu, Yuqi
    Fan, Jing
    Findings of the Association for Computational Linguistics: ACL 2023, 2023: 3280-3293
  • [10] ZooKT: Task-adaptive knowledge transfer of Model Zoo for few-shot learning
    Zhang, Baoquan
    Shan, Bingqi
    Li, Aoxue
    Luo, Chuyao
    Ye, Yunming
    Li, Zhenguo
    Pattern Recognition, 2025, 158