Few-Shot Class-Incremental Learning via Class-Aware Bilateral Distillation

Citations: 30
|
Authors
Zhao, Linglan [1 ]
Lu, Jing [2 ]
Xu, Yunlu [2 ]
Cheng, Zhanzhan [2 ]
Guo, Dashan [1 ]
Niu, Yi [2 ]
Fang, Xiangzhong [1 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Dept Elect Engn, Shanghai, Peoples R China
[2] Hikvision Res Inst, Hangzhou, Zhejiang, Peoples R China
Keywords
DOI
10.1109/CVPR52729.2023.01139
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Few-Shot Class-Incremental Learning (FSCIL) aims to continually learn novel classes from only a few training samples, a more challenging task than the well-studied Class-Incremental Learning (CIL) due to data scarcity. While knowledge distillation, a prevailing technique in CIL, can alleviate catastrophic forgetting of older classes by regularizing outputs between the current and previous models, it fails to consider the overfitting risk of novel classes in FSCIL. To adapt this powerful distillation technique to FSCIL, we propose a novel distillation structure that takes the unique challenge of overfitting into account. Concretely, we draw knowledge from two complementary teachers. One is the model trained on abundant data from base classes, which carries rich general knowledge that can be leveraged to ease the overfitting of current novel classes. The other is the updated model from the last incremental session, which contains the adapted knowledge of previous novel classes and is used to alleviate their forgetting. To combine the two sources of guidance, an adaptive strategy conditioned on class-wise semantic similarities is introduced. Moreover, to better preserve base-class knowledge while accommodating novel concepts, we adopt a two-branch network with an attention-based aggregation module that dynamically merges predictions from the two complementary branches. Extensive experiments on three popular FSCIL datasets (mini-ImageNet, CIFAR100, and CUB200) validate the effectiveness of our method, which surpasses existing works by a significant margin. Code is available at https://github.com/LinglanZhao/BiDistFSCIL.
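The abstract's core idea, adaptively blending soft targets from two teachers, can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation (their code is at the linked repository): the function names `bilateral_distill_targets` and `softmax`, and the per-sample scalar weight standing in for the class-wise semantic similarity, are assumptions made here for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax over the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def bilateral_distill_targets(base_logits, prev_logits, similarity):
    """Blend soft targets from the two complementary teachers.

    base_logits: (N, C) logits from the base-session teacher, which
                 carries general knowledge that curbs novel-class overfitting.
    prev_logits: (N, C) logits from the last-session teacher, which
                 carries adapted knowledge of previous novel classes.
    similarity:  (N,) per-sample similarity of a novel sample to the base
                 classes, in [0, 1]; higher values lean on the base teacher.
    """
    w = similarity[:, None]
    return w * softmax(base_logits) + (1.0 - w) * softmax(prev_logits)
```

The returned distribution would then serve as the target in a standard distillation loss (e.g. cross-entropy or KL divergence against the student's softened outputs); the paper additionally merges two network branches with an attention-based aggregation module, which this sketch omits.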
Pages: 11838 - 11847
Page count: 10
Related Papers
50 items in total
  • [21] Few-shot class-incremental learning based on representation enhancement
    Yao, Guangle
    Zhu, Juntao
    Zhou, Wenlong
    Li, Jun
    JOURNAL OF ELECTRONIC IMAGING, 2022, 31 (04)
  • [22] Decision Boundary Optimization for Few-shot Class-Incremental Learning
    Guo, Chenxu
    Zhao, Qi
    Lyu, Shuchang
    Liu, Binghao
    Wang, Chunlei
    Chen, Lijiang
    Cheng, Guangliang
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION WORKSHOPS, ICCVW, 2023, : 3493 - 3503
  • [23] Flexible few-shot class-incremental learning with prototype container
    Xu, Xinlei
    Wang, Zhe
    Fu, Zhiling
    Guo, Wei
    Chi, Ziqiu
    Li, Dongdong
    NEURAL COMPUTING & APPLICATIONS, 2023, 35 (15): 10875 - 10889
  • [24] Few-Shot Class-Incremental Learning for Named Entity Recognition
    Wang, Rui
    Yu, Tong
    Zhao, Handong
    Kim, Sungchul
    Mitra, Subrata
    Zhang, Ruiyi
    Henao, Ricardo
    PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 1: (LONG PAPERS), 2022, : 571 - 582
  • [25] Memorizing Complementation Network for Few-Shot Class-Incremental Learning
    Ji, Zhong
    Hou, Zhishen
    Liu, Xiyao
    Pang, Yanwei
    Li, Xuelong
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2023, 32 : 937 - 948
  • [26] Few-Shot Class-Incremental Learning with Meta-Learned Class Structures
    Zheng, Guangtao
    Zhang, Aidong
    21ST IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS ICDMW 2021, 2021, : 421 - 430
  • [27] Analogical Learning-Based Few-Shot Class-Incremental Learning
    Li, Jiashuo
    Dong, Songlin
    Gong, Yihong
    He, Yuhang
    Wei, Xing
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2024, 34 (07) : 5493 - 5504
  • [28] Rethinking Few-Shot Class-Incremental Learning: Learning from Yourself
    Tang, Yu-Ming
    Peng, Yi-Xing
    Meng, Jingke
    Zheng, Wei-Shi
    COMPUTER VISION - ECCV 2024, PT LXI, 2025, 15119 : 108 - 128
  • [29] Prompt-based learning for few-shot class-incremental learning
    Yuan, Jicheng
    Chen, Hang
    Tian, Songsong
    Li, Wenfa
    Li, Lusi
    Ning, Enhao
    Zhang, Yugui
    ALEXANDRIA ENGINEERING JOURNAL, 2025, 120 : 287 - 295
  • [30] Rethinking few-shot class-incremental learning: A lazy learning baseline
    Qin, Zhili
    Han, Wei
    Liu, Jiaming
    Zhang, Rui
    Yang, Qingli
    Sun, Zejun
    Shao, Junming
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 250