Latent Coreset Sampling based Data-Free Continual Learning

Cited by: 3
Authors
Wang, Zhuoyi [1 ]
Li, Dingcheng [1 ]
Li, Ping [1 ]
Affiliations
[1] Baidu Res, Cognit Comp Lab, Bellevue, WA 98004 USA
Keywords
Continual Learning; Data-free; Coreset Sampling; Latent representation;
DOI
10.1145/3511808.3557375
CLC Classification
TP [Automation Technology; Computer Technology];
Subject Classification
0812
Abstract
Catastrophic forgetting poses a major challenge in continual learning: old knowledge is forgotten when the model is updated on new tasks. Existing solutions typically address this challenge through generative models or exemplar-replay strategies. However, such methods do not prevent low-quality samples from being generated or selected for replay, which directly reduces the effectiveness of the model, especially under class imbalance, noise, or redundancy. Accordingly, selecting a suitable coreset during continual learning becomes significant in such settings. In this work, we propose a novel approach that leverages continual coreset sampling (CCS) to address these challenges. We aim to select the most representative subsets during each iteration: when the model is trained on new tasks, the selected coreset closely approximates/matches the gradient of both the previous and current tasks with respect to the model parameters, so that adaptation of the model to new datasets is more efficient. Furthermore, instead of storing old data to maintain old knowledge, our approach preserves it in the latent space: we augment the previous classes in the embedding space as pseudo sample vectors from the old encoder output, strengthened by joint training with the selected new data. This avoids data privacy invasions in real-world applications when we update the model. Our experiments validate the effectiveness of our proposed approach on various CV/NLP datasets against current baselines, and show clear improvements in model adaptation and forgetting reduction in a data-free manner.
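To make the gradient-matching idea in the abstract concrete, the following is a minimal, hypothetical sketch of greedy coreset selection for a logistic-regression model: it greedily picks the k samples whose mean per-sample gradient best approximates the full-dataset gradient. All function names and the model choice are illustrative assumptions, not the authors' actual CCS implementation, which operates on deep encoders across tasks.

```python
import numpy as np

def per_sample_grads(X, y, w):
    # Per-sample gradients of logistic loss: (sigmoid(Xw) - y) * x_i
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return (p - y)[:, None] * X

def greedy_coreset(X, y, w, k):
    """Greedily select k indices whose mean gradient is closest (in L2)
    to the full-dataset mean gradient."""
    G = per_sample_grads(X, y, w)
    target = G.mean(axis=0)                # gradient we want to match
    chosen = []
    running = np.zeros_like(target)        # mean gradient of chosen set
    for _ in range(k):
        best_i, best_err = None, np.inf
        for i in range(len(X)):
            if i in chosen:
                continue
            # Mean gradient if sample i were added to the coreset
            cand = (running * len(chosen) + G[i]) / (len(chosen) + 1)
            err = np.linalg.norm(cand - target)
            if err < best_err:
                best_i, best_err = i, err
        chosen.append(best_i)
        running = (running * (len(chosen) - 1) + G[best_i]) / len(chosen)
    return chosen

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
w_true = rng.normal(size=5)
y = (X @ w_true > 0).astype(float)
idx = greedy_coreset(X, y, rng.normal(size=5), k=10)
print(len(idx), len(set(idx)))  # 10 10 (ten distinct sample indices)
```

In the paper's setting, the matching objective would additionally cover gradients of previous tasks, and old classes are replayed as pseudo vectors in the encoder's latent space rather than as stored raw data.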
Pages: 2078 - 2087
Page count: 10
Related Papers
50 records
  • [11] Customizing Synthetic Data for Data-Free Student Learning
    Luo, Shiya
    Chen, Defang
    Wang, Can
    2023 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, ICME, 2023, : 1817 - 1822
  • [12] Class Impression for Data-Free Incremental Learning
    Ayromlou, Sana
    Abolmaesumi, Purang
    Tsang, Teresa
    Li, Xiaoxiao
    MEDICAL IMAGE COMPUTING AND COMPUTER ASSISTED INTERVENTION, MICCAI 2022, PT IV, 2022, 13434 : 320 - 329
  • [13] FREE: Faster and Better Data-Free Meta-Learning
    Wei, Yongxian
    Hui, Zixuan
    Wang, Zhenyi
    Shen, Li
    Yuan, Chun
    Tao, Dacheng
    2024 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2024, : 23273 - 23282
  • [14] Data-Free Learning of Reduced-Order Kinematics
    Sharp, Nicholas
    Romero, Cristian
    Jacobson, Alec
    Vouga, Etienne
    Kry, Paul G.
    Levin, David I. W.
    Solomon, Justin
    PROCEEDINGS OF SIGGRAPH 2023 CONFERENCE PAPERS, SIGGRAPH 2023, 2023,
  • [15] Data-Free Generalized Zero-Shot Learning
    Tang, Bowen
    Zhang, Jing
    Yan, Long
    Yu, Qian
    Sheng, Lu
    Xu, Dong
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 6, 2024, : 5108 - 5117
  • [16] Data-Free Evaluation of User Contributions in Federated Learning
    Lv, Hongtao
    Zheng, Zhenzhe
    Luo, Tie
    Wu, Fan
    Tang, Shaojie
    Hua, Lifeng
    Jie, Rongfei
    Lv, Chengfei
    2021 19TH INTERNATIONAL SYMPOSIUM ON MODELING AND OPTIMIZATION IN MOBILE, AD HOC, AND WIRELESS NETWORKS (WIOPT), 2021,
  • [17] Data-free adaptive structured pruning for federated learning
    Fan, Wei
    Yang, Keke
    Wang, Yifan
    Chen, Cong
    Li, Jing
    JOURNAL OF SUPERCOMPUTING, 2024, 80 (13): : 18600 - 18626
  • [18] Communication Efficient Coreset Sampling for Distributed Learning
    Fan, Yawen
    Li, Husheng
    2018 IEEE 19TH INTERNATIONAL WORKSHOP ON SIGNAL PROCESSING ADVANCES IN WIRELESS COMMUNICATIONS (SPAWC), 2018, : 76 - 80
  • [19] Data-Free Knowledge Distillation for Heterogeneous Federated Learning
    Zhu, Zhuangdi
    Hong, Junyuan
    Zhou, Jiayu
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [20] DFRD: Data-Free Robustness Distillation for Heterogeneous Federated Learning
    Luo, Kangyang
    Wang, Shuai
    Fu, Yexuan
    Li, Xiang
    Lan, Yunshi
    Gao, Ming
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,