Class Impression for Data-Free Incremental Learning

Times Cited: 2
Authors
Ayromlou, Sana [1 ]
Abolmaesumi, Purang [1 ]
Tsang, Teresa [2 ]
Li, Xiaoxiao [1 ]
Affiliations
[1] Univ British Columbia, Vancouver, BC, Canada
[2] Vancouver Gen Hosp, Vancouver, BC, Canada
Funding
Natural Sciences and Engineering Research Council of Canada; Canadian Institutes of Health Research;
Keywords
DOI
10.1007/978-3-031-16440-8_31
CLC Number
TP39 [Computer Applications];
Discipline Code
081203; 0835;
Abstract
Standard deep learning-based classification approaches require collecting all samples from all classes in advance and are trained offline. This paradigm may not be practical in real-world clinical applications, where new classes are incrementally introduced through the addition of new data. Class incremental learning is a strategy that allows learning from such data. However, a major challenge is catastrophic forgetting, i.e., performance degradation on previous classes when adapting a trained model to new data. To alleviate this challenge, prior methodologies save a portion of the training data, which requires perpetual storage and may introduce privacy issues. Here, we propose a novel data-free class incremental learning framework that first synthesizes data from the model trained on previous classes to generate a Class Impression. Subsequently, it updates the model by combining the synthesized data with new class data. Furthermore, we incorporate a cosine-normalized cross-entropy loss to mitigate the adverse effects of data imbalance, a margin loss to increase the separation between previous and new classes, and an intra-domain contrastive loss to generalize the model trained on the synthesized data to real data. We compare our proposed framework with state-of-the-art methods in class incremental learning and demonstrate improved accuracy on the classification of 11,062 echocardiography cine series of patients. Code is available at https://github.com/sanaAyrml/Class-Impresion-for-Data-free-Incremental-Learning.
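The two core ideas in the abstract, synthesizing class-conditional data from a frozen model and training with a cosine-normalized cross-entropy, can be illustrated with a minimal sketch. This is not the authors' released implementation (see the linked repository for that): the names synthesize_class_impression, cosine_cross_entropy, prev_model, the image shape, the smoothness prior, and all hyper-parameters below are illustrative assumptions.

import torch
import torch.nn.functional as F

def synthesize_class_impression(prev_model, target_class, num_samples=8,
                                image_shape=(1, 112, 112), steps=200, lr=0.05):
    """Optimize random noise into synthetic samples that the frozen model
    trained on previous classes assigns confidently to `target_class`."""
    prev_model.eval()                                   # frozen model from earlier classes
    x = torch.randn(num_samples, *image_shape, requires_grad=True)
    optimizer = torch.optim.Adam([x], lr=lr)
    labels = torch.full((num_samples,), target_class, dtype=torch.long)

    for _ in range(steps):
        optimizer.zero_grad()
        ce = F.cross_entropy(prev_model(x), labels)     # push toward the old class
        # Illustrative total-variation smoothness prior; concrete regularizers
        # differ between data-free synthesis methods (an assumption here).
        tv = (x[..., 1:, :] - x[..., :-1, :]).abs().mean() + \
             (x[..., :, 1:] - x[..., :, :-1]).abs().mean()
        (ce + 1e-3 * tv).backward()
        optimizer.step()

    return x.detach()                                   # synthetic "impressions" of the class

def cosine_cross_entropy(features, class_weights, labels, scale=10.0):
    """Cross-entropy on scaled cosine similarities between L2-normalized
    features and class weight vectors; `scale` is an assumed temperature."""
    logits = scale * F.normalize(features, dim=1) @ F.normalize(class_weights, dim=1).t()
    return F.cross_entropy(logits, labels)

In the framework described above, such synthesized impressions are combined with real samples of the new classes, and the margin and intra-domain contrastive losses are added on top of the cosine-normalized objective; the sketch only conveys the general recipe.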
Pages: 320-329
Number of Pages: 10
Related Papers
50 records in total
  • [31] Data-Free/Data-Sparse Softmax Parameter Estimation With Structured Class Geometries
    Ahmed, Nisar
    IEEE SIGNAL PROCESSING LETTERS, 2018, 25 (09) : 1408 - 1412
  • [32] Learning to Generate Diverse Data From a Temporal Perspective for Data-Free Quantization
    Luo, Hui
    Zhang, Shuhai
    Zhuang, Zhuangwei
    Mai, Jiajie
    Tan, Mingkui
    Zhang, Jianlin
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2024, 34 (10) : 9484 - 9498
  • [33] Data-Free Model Extraction
    Truong, Jean-Baptiste
    Maini, Pratyush
    Walls, Robert J.
    Papernot, Nicolas
    2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021, 2021, : 4769 - 4778
  • [34] Adaptive Data-Free Quantization
    Qian, Biao
    Wang, Yang
    Hong, Richang
    Wang, Meng
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR, 2023, : 7960 - 7968
  • [35] Robust Heterogeneous Federated Learning via Data-Free Knowledge Amalgamation
    Ma, Jun
    Fan, Zheng
    Fan, Chaoyu
    Kang, Qi
    ADVANCES IN SWARM INTELLIGENCE, PT II, ICSI 2024, 2024, 14789 : 61 - 71
  • [36] Ask, Acquire, and Attack: Data-Free UAP Generation Using Class Impressions
    Mopuri, Konda Reddy
    Uppala, Phani Krishna
    Babu, R. Venkatesh
    COMPUTER VISION - ECCV 2018, PT IX, 2018, 11213 : 20 - 35
  • [37] Data-Free Learning for Lightweight Multi-Weather Image Restoration
    Wang, Pei
    Huang, Hongzhan
    Luo, Xiaotong
    Qu, Yanyun
    2024 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS, ISCAS 2024, 2024,
  • [38] Relation-Guided Adversarial Learning for Data-Free Knowledge Transfer
    Liang, Yingping
    Fu, Ying
    INTERNATIONAL JOURNAL OF COMPUTER VISION, 2024, : 2868 - 2885
  • [39] A Category-Aware Curriculum Learning for Data-Free Knowledge Distillation
    Li, Xiufang
    Jiao, Licheng
    Sun, Qigong
    Liu, Fang
    Liu, Xu
    Li, Lingling
    Chen, Puhua
    Yang, Shuyuan
    IEEE TRANSACTIONS ON MULTIMEDIA, 2024, 26 : 9603 - 9618
  • [40] Network-light not data-free
    French, Katherine M.
    Riley, Steven
    Garnett, Geoff P.
    SEXUALLY TRANSMITTED DISEASES, 2007, 34 (01) : 57 - 58