Latent Coreset Sampling based Data-Free Continual Learning

Cited by: 3
Authors
Wang, Zhuoyi [1]
Li, Dingcheng [1]
Li, Ping [1]
Affiliations
[1] Baidu Research, Cognitive Computing Lab, Bellevue, WA 98004 USA
Keywords
Continual Learning; Data-free; Coreset Sampling; Latent representation
DOI
10.1145/3511808.3557375
Chinese Library Classification (CLC)
TP [automation technology; computer technology]
Discipline code
0812
Abstract
Catastrophic forgetting poses a major challenge in continual learning: old knowledge is forgotten when the model is updated on new tasks. Existing solutions tend to address this challenge through generative models or exemplar-replay strategies. However, such methods cannot prevent low-quality samples from being generated or selected for replay, which directly reduces the effectiveness of the model, especially in scenarios with class imbalance, noise, or redundancy. Accordingly, selecting a suitable coreset during continual learning becomes crucial in such settings. In this work, we propose a novel approach that leverages continual coreset sampling (CCS) to address these challenges. We aim to select the most representative subsets during each iteration: when the model is trained on new tasks, the selected subset closely approximates/matches the gradient of both the previous and current tasks with respect to the model parameters. This way, adaptation of the model to new datasets can be more efficient. Furthermore, instead of storing old data to maintain old knowledge, our approach preserves it in the latent space. We augment the previous classes in the embedding space as pseudo sample vectors from the old encoder output, strengthened by joint training with the selected new data. This avoids data privacy invasions in real-world applications when the model is updated. Our experiments validate the effectiveness of the proposed approach on various CV/NLP datasets against current baselines, and demonstrate clear improvements in model adaptation and forgetting reduction in a data-free manner.
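The gradient-matching idea described in the abstract can be illustrated with a toy sketch: greedily pick a subset of samples whose averaged per-sample gradient best approximates the average gradient of the full dataset. This is a generic matching-pursuit-style heuristic written for illustration, not the paper's actual CCS algorithm; the function name, the synthetic "gradients", and all parameters below are assumptions.

```python
import math
import random

def greedy_gradient_coreset(grads, k):
    """Greedily select k sample indices whose mean gradient best matches
    the mean gradient over all samples (illustrative heuristic only).

    grads: list of per-sample gradient vectors (lists of floats).
    k:     coreset size to select.
    """
    n, dim = len(grads), len(grads[0])
    # Target: the full-data average gradient.
    target = [sum(g[d] for g in grads) / n for d in range(dim)]
    selected = []
    running = [0.0] * dim  # sum of gradients of chosen samples
    for _ in range(k):
        best_i, best_err = -1, float("inf")
        for i in range(n):
            if i in selected:
                continue
            m = len(selected) + 1
            # Distance between candidate coreset mean and the target.
            err = math.sqrt(sum(
                ((running[d] + grads[i][d]) / m - target[d]) ** 2
                for d in range(dim)))
            if err < best_err:
                best_i, best_err = i, err
        selected.append(best_i)
        running = [running[d] + grads[best_i][d] for d in range(dim)]
    return selected

# Toy usage: 100 synthetic per-sample "gradients" in R^8.
random.seed(0)
grads = [[random.gauss(0.0, 1.0) for _ in range(8)] for _ in range(100)]
coreset = greedy_gradient_coreset(grads, k=10)
```

The actual method additionally matches gradients of previous tasks via latent pseudo-samples, which this sketch omits.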
Pages: 2078 - 2087
Page count: 10
Related papers
50 records in total
  • [1] Memory efficient data-free distillation for continual learning
    Li, Xiaorong
    Wang, Shipeng
    Sun, Jian
    Xu, Zongben
    PATTERN RECOGNITION, 2023, 144
  • [2] Variational Data-Free Knowledge Distillation for Continual Learning
    Li, Xiaorong
    Wang, Shipeng
    Sun, Jian
    Xu, Zongben
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (10) : 12618 - 12634
  • [3] A novel data-free continual learning method with contrastive reversion
    Wu, Chu
    Xie, Runshan
    Wang, Shitong
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2024, 15 (02) : 505 - 518
  • [4] Decoding BatchNorm statistics via anchors pool for data-free models based on continual learning
    Li, Xiaobin
    Wang, Weiqiang
    Xu, Guangluan
    NEURAL COMPUTING AND APPLICATIONS, 2025, 37 (6) : 5039 - 5055
  • [5] ConStruct-VL: Data-Free Continual Structured VL Concepts Learning
    Smith, James Seale
    Cascante-Bonilla, Paola
    Arbelle, Assaf
    Kim, Donghyun
    Panda, Rameswar
    Cox, David
    Yang, Diyi
    Kira, Zsolt
    Feris, Rogerio
    Karlinsky, Leonid
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2023, : 14994 - 15004
  • [6] CCSI: Continual Class-Specific Impression for data-free class incremental learning
    Ayromlou, Sana
    Tsang, Teresa
    Abolmaesumi, Purang
    Li, Xiaoxiao
    MEDICAL IMAGE ANALYSIS, 2024, 97
  • [7] GCR: Gradient Coreset based Replay Buffer Selection for Continual Learning
    Tiwari, Rishabh
    Killamsetty, Krishnateja
    Iyer, Rishabh
    Shenoy, Pradeep
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2022), 2022, : 99 - 108
  • [8] Data-Free Learning of Student Networks
    Chen, Hanting
    Wang, Yunhe
    Xu, Chang
    Yang, Zhaohui
    Liu, Chuanjian
    Shi, Boxin
    Xu, Chunjing
    Xu, Chao
    Tian, Qi
    2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2019), 2019, : 3513 - 3521
  • [9] Latent Code Augmentation Based on Stable Diffusion for Data-Free Substitute Attacks
    Shao, Mingwen
    Meng, Lingzhuang
    Qiao, Yuanjian
    Zhang, Lixu
    Zuo, Wangmeng
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2025