CCSI: Continual Class-Specific Impression for data-free class incremental learning

Cited by: 1
Authors
Ayromlou, Sana [1 ,3 ]
Tsang, Teresa [2 ]
Abolmaesumi, Purang [1 ]
Li, Xiaoxiao [1 ,3 ]
Affiliations
[1] Univ British Columbia, Elect & Comp Engn Dept, Vancouver, BC V6T 1Z4, Canada
[2] Vancouver Gen Hosp, Vancouver, BC V5Z 1M9, Canada
[3] Vector Inst, Toronto, ON M5G 0C6, Canada
Funding
Natural Sciences and Engineering Research Council of Canada (NSERC)
Keywords
Class incremental learning; Data synthesis; Echo-cardiograms; Computed tomography; Microscopy imaging; CLASSIFICATION;
DOI
10.1016/j.media.2024.103239
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
In real-world clinical settings, traditional deep learning-based classification methods struggle to diagnose newly introduced disease types because they require samples from all disease classes for offline training. Class incremental learning offers a promising solution by adapting a deep network trained on specific disease classes to handle new diseases. However, catastrophic forgetting occurs, degrading performance on earlier classes when the model is adapted to new data. Previously proposed methodologies overcome this by perpetually storing previous samples, which raises practical concerns regarding privacy and storage regulations in healthcare. To this end, we propose a novel data-free class incremental learning framework that synthesizes data for learned classes instead of storing data from previous classes. Our key contributions include acquiring synthetic data, termed Continual Class-Specific Impression (CCSI), for previously inaccessible trained classes, and presenting a methodology to effectively utilize this data when updating the network with new classes. We obtain CCSI by employing data inversion over the gradients of the classification model trained on previous classes, starting from the mean image of each class (inspired by the common landmarks shared among medical images) and using continual normalization layer statistics as a regularizer in this pixelwise optimization process. Subsequently, we update the network by combining the synthesized data with new-class data and incorporating several losses: an intra-domain contrastive loss to generalize the deep network trained on synthesized data to real data, a margin loss to increase separation between previous and new classes, and a cosine-normalized cross-entropy loss to alleviate the adverse effects of imbalanced distributions in the training data.
Extensive experiments show that the proposed framework achieves state-of-the-art performance on four public MedMNIST datasets and an in-house echocardiography cine series, improving classification accuracy by up to 51% over baseline data-free methods. Our code is available at https://github.com/ubc-tea/Continual-Impression-CCSI.
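The cosine-normalized cross-entropy loss mentioned in the abstract can be illustrated with a minimal NumPy sketch: both the feature vectors and the classifier weight vectors are L2-normalized so that logits become cosine similarities, which removes the magnitude bias that imbalanced class distributions induce in classifier weights. This is an assumption-based illustration of the general technique, not the authors' implementation; the function name and the temperature value `tau` are hypothetical.

```python
import numpy as np

def cosine_normalized_ce(features, weights, labels, tau=0.05):
    """Cross-entropy over cosine-similarity logits.

    features: (batch, dim) raw feature vectors
    weights:  (num_classes, dim) classifier weight vectors
    labels:   (batch,) integer class labels
    tau:      temperature sharpening the softmax (hypothetical value)
    """
    # L2-normalize features and class weights so logits are cosines in [-1, 1].
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    w = weights / np.linalg.norm(weights, axis=1, keepdims=True)
    logits = f @ w.T / tau
    # Standard numerically stable log-softmax + negative log-likelihood.
    logits = logits - logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()
```

Because every class contributes a unit-norm weight vector, a class seen with few (or only synthetic) samples cannot be dominated merely by the larger weight magnitudes of data-rich classes.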
Pages: 14