Class Impression for Data-Free Incremental Learning

Cited by: 2
Authors
Ayromlou, Sana [1 ]
Abolmaesumi, Purang [1 ]
Tsang, Teresa [2 ]
Li, Xiaoxiao [1 ]
Affiliations
[1] Univ British Columbia, Vancouver, BC, Canada
[2] Vancouver Gen Hosp, Vancouver, BC, Canada
Funding
Natural Sciences and Engineering Research Council of Canada; Canadian Institutes of Health Research
Keywords
DOI
10.1007/978-3-031-16440-8_31
CLC Number (Chinese Library Classification)
TP39 [Computer Applications];
Subject Classification Codes
081203; 0835;
Abstract
Standard deep learning-based classification approaches require collecting all samples from all classes in advance and are trained offline. This paradigm may not be practical in real-world clinical applications, where new classes are introduced incrementally as new data arrive. Class incremental learning is a strategy that allows learning from such data. However, a major challenge is catastrophic forgetting, i.e., performance degradation on previous classes when adapting a trained model to new data. To alleviate this challenge, prior methods save a portion of the training data, which requires perpetual storage and may introduce privacy issues. Here, we propose a novel data-free class incremental learning framework that first synthesizes data from the model trained on previous classes to generate a Class Impression. Subsequently, it updates the model by combining the synthesized data with data from the new classes. Furthermore, we incorporate a cosine-normalized cross-entropy loss to mitigate the adverse effects of the imbalance between old and new classes, a margin loss to increase the separation between previous and new classes, and an intra-domain contrastive loss to generalize the model trained on synthesized data to real data. We compare our framework with state-of-the-art methods in class incremental learning and demonstrate improved accuracy in classifying 11,062 echocardiography cine series from patients. Code is available at https://github.com/sanaAyrml/Class-Impresion-for-Data-free-Incremental-Learning
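
The abstract describes two core mechanisms: synthesizing a "Class Impression" by inverting the frozen previous-stage model, and scoring with a cosine-normalized classifier so that synthesized old-class data and real new-class data contribute on a comparable scale. The sketch below illustrates both ideas under stated assumptions; it is not the authors' implementation (see their repository above). The function names synthesize_class_impression and cosine_normalized_logits, the input shape, and the optimization hyperparameters are illustrative, and the margin and intra-domain contrastive losses mentioned in the abstract are omitted for brevity.

```python
import torch
import torch.nn.functional as F


def synthesize_class_impression(frozen_model, target_class, num_images=16,
                                image_shape=(1, 112, 112), steps=200, lr=0.05):
    """Model-inversion-style synthesis: optimize random inputs until the frozen
    previous-stage classifier confidently assigns them to `target_class`."""
    frozen_model.eval()
    for p in frozen_model.parameters():          # keep the old model fixed
        p.requires_grad_(False)

    # Treat the pixels themselves as the learnable parameters.
    images = torch.randn(num_images, *image_shape, requires_grad=True)
    labels = torch.full((num_images,), target_class, dtype=torch.long)
    optimizer = torch.optim.Adam([images], lr=lr)

    for _ in range(steps):
        optimizer.zero_grad()
        logits = frozen_model(images)
        # Minimizing cross-entropy toward `target_class` turns the noise into
        # prototypical "impressions" of that class.
        loss = F.cross_entropy(logits, labels)
        loss.backward()
        optimizer.step()

    return images.detach(), labels


def cosine_normalized_logits(features, class_weights, scale=10.0):
    """Cosine similarity between L2-normalized features and class weights.
    Normalization removes the magnitude bias between old classes (seen only
    through synthesized data) and new classes (seen through real data)."""
    features = F.normalize(features, dim=1)
    class_weights = F.normalize(class_weights, dim=1)
    return scale * features @ class_weights.t()
```

In an incremental step, one would synthesize a batch of impressions per previous class, mix them with the new-class training data, and fine-tune the classifier on the combined set using the losses listed in the abstract.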
Pages: 320-329
Page count: 10