Imbalance Mitigation for Continual Learning via Knowledge Decoupling and Dual Enhanced Contrastive Learning

Cited by: 1
Authors
Ji, Zhong [1 ,2 ]
Jiao, Zhanyu [1 ]
Wang, Qiang [1 ]
Pang, Yanwei [1 ,2 ]
Han, Jungong [3 ]
Affiliations
[1] Tianjin Univ, Sch Elect & Informat Engn, Tianjin 300072, Peoples R China
[2] Shanghai Artificial Intelligence Lab, Shanghai 200232, Peoples R China
[3] Univ Sheffield, Dept Comp Sci, Sheffield S10 2TG, S Yorkshire, England
Funding
National Natural Science Foundation of China
Keywords
Catastrophic forgetting; continual learning (CL); experience replay (ER); image classification
DOI
10.1109/TNNLS.2023.3347477
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Continual learning (CL) studies how to learn new knowledge continuously from data streams without catastrophically forgetting previous knowledge. A key problem is catastrophic forgetting: the model's performance on previous tasks declines significantly after learning subsequent tasks. Several studies address it by replaying samples stored in a buffer when training new tasks. However, the data imbalance between old- and new-task samples causes two serious problems: information suppression and weak feature discriminability. The former refers to information in the abundant new-task samples suppressing that in the old-task samples, which harms knowledge retention since the biased output worsens the consistency of the same sample's outputs at different moments. The latter refers to the feature representation being biased toward the new task and thus lacking the discrimination needed to distinguish old and new tasks. To this end, we build an imbalance mitigation for CL (IMCL) framework that incorporates a decoupled knowledge distillation (DKD) approach and a dual enhanced contrastive learning (DECL) approach to tackle both problems. Specifically, DKD alleviates the suppression of old tasks by the new task by decoupling the model's output probability during the replay stage, which better maintains old-task knowledge. DECL enhances both low- and high-level features and fuses the enhanced features to construct a contrastive loss that effectively distinguishes different tasks. Extensive experiments on three popular datasets show that our method achieves promising performance under task incremental learning (Task-IL), class incremental learning (Class-IL), and domain incremental learning (Domain-IL) settings.
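The abstract does not give the paper's exact formulation, but the general "decoupled knowledge distillation" idea it builds on splits the usual distillation loss into a target-class term and a non-target-class term so the two can be weighted independently, which is how the suppression of old-task knowledge can be relieved. The sketch below is a minimal, generic illustration of that decomposition in plain Python; the function name `decoupled_kd_loss` and the weights `alpha`/`beta` are illustrative assumptions, not the IMCL implementation.

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kl(p, q, eps=1e-12):
    """KL divergence KL(p || q) between two discrete distributions."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def decoupled_kd_loss(teacher_logits, student_logits, target, alpha=1.0, beta=2.0):
    """Generic decoupled KD: separate the distillation loss into a
    target-vs-rest (binary) term and a term over the non-target classes
    only, so each part can be weighted independently."""
    pt = softmax(teacher_logits)
    ps = softmax(student_logits)
    # Target-class term: binary distribution (target prob vs. everything else).
    tckd = kl([pt[target], 1.0 - pt[target]], [ps[target], 1.0 - ps[target]])
    # Non-target term: renormalized distribution over the remaining classes.
    nt = [p / (1.0 - pt[target]) for i, p in enumerate(pt) if i != target]
    ns = [p / (1.0 - ps[target]) for i, p in enumerate(ps) if i != target]
    nckd = kl(nt, ns)
    return alpha * tckd + beta * nckd
```

A larger `beta` emphasizes matching the teacher's relative ordering of the non-target classes, which carries much of the "dark knowledge" that replay-based methods try to preserve for old tasks.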
Pages: 3450-3463
Page count: 14
Related Papers
50 records in total
  • [21] Review-enhanced contrastive learning on knowledge graphs for recommendation
    Liu, Yun
    Kertkeidkachorn, Natthawut
    Miyazaki, Jun
    Ichise, Ryutaro
    Expert Systems with Applications, 2025, 277
  • [22] Knowledge enhanced graph contrastive learning for match outcome prediction
    Jiang, Junji
    Wu, Likang
    Hu, Zhipeng
    Wu, Runze
    Shen, Xudong
    Zhao, Hongke
    INFORMATION PROCESSING & MANAGEMENT, 2025, 62 (03)
  • [23] Continual Nuclei Segmentation via Prototype-Wise Relation Distillation and Contrastive Learning
    Wu, Huisi
    Wang, Zhaoze
    Zhao, Zebin
    Chen, Cheng
    Qin, Jing
    IEEE TRANSACTIONS ON MEDICAL IMAGING, 2023, 42 (12) : 3794 - 3804
  • [24] CCL: Continual Contrastive Learning for LiDAR Place Recognition
    Cui, Jiafeng
    Chen, Xieyuanli
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2023, 8 (08) : 4433 - 4440
  • [25] Gradient Regularized Contrastive Learning for Continual Domain Adaptation
    Tang, Shixiang
    Su, Peng
    Chen, Dapeng
    Ouyang, Wanli
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 2665 - 2673
  • [26] Contrastive Correlation Preserving Replay for Online Continual Learning
    Yu, Da
    Zhang, Mingyi
    Li, Mantian
    Zha, Fusheng
    Zhang, Junge
    Sun, Lining
    Huang, Kaiqi
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2024, 34 (01) : 124 - 139
  • [27] A Contrastive Continual Learning for the Classification of Remote Sensing Imagery
    Alakooz, Abdulaziz S.
    Ammour, Nassim
    2022 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS 2022), 2022, : 7902 - 7905
  • [28] Co2L: Contrastive Continual Learning
    Cha, Hyuntak
    Lee, Jaeho
    Shin, Jinwoo
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 9496 - 9505
  • [29] Continual Learning Based on Knowledge Distillation and Representation Learning
    Chen, Xiu-Yan
    Liu, Jian-Wei
    Li, Wen-Tao
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2022, PT IV, 2022, 13532 : 27 - 38
  • [30] MediDRNet: Tackling category imbalance in diabetic retinopathy classification with dual-branch learning and prototypical contrastive learning
    Teng, Siying
    Wang, Bo
    Yang, Feiyang
    Yi, Xingcheng
    Zhang, Xinmin
    Sun, Yabin
    COMPUTER METHODS AND PROGRAMS IN BIOMEDICINE, 2024, 253