Decomposed Knowledge Distillation for Class-Incremental Semantic Segmentation

Cited: 0
Authors
Baek, Donghyeon [1 ]
Oh, Youngmin [1 ]
Lee, Sanghoon [1 ]
Lee, Junghyup [1 ]
Ham, Bumsub [1 ,2 ]
Affiliations
[1] Yonsei Univ, Seoul, South Korea
[2] Korea Inst Sci & Technol KIST, Seoul, South Korea
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Class-incremental semantic segmentation (CISS) continually labels each pixel of an image with a corresponding object/stuff class. To this end, it is crucial to learn novel classes incrementally without forgetting previously learned knowledge. Current CISS methods typically adopt a knowledge distillation (KD) technique to preserve classifier logits, or freeze a feature extractor, to avoid the forgetting problem. These strong constraints, however, prevent learning discriminative features for novel classes. We introduce a CISS framework that alleviates the forgetting problem and facilitates learning novel classes effectively. We have found that a logit can be decomposed into two terms that quantify how likely an input is, or is not, to belong to a particular class, providing a clue to the reasoning process of a model. The KD technique, in this context, preserves only the sum of the two terms (i.e., the class logit), so each term can still change freely, and the KD thus does not imitate the reasoning process. To impose constraints on each term explicitly, we propose a new decomposed knowledge distillation (DKD) technique, improving the rigidity of a model and addressing the forgetting problem more effectively. We also introduce a novel initialization method to train new classifiers for novel classes. In CISS, the number of negative training samples for novel classes is not sufficient to discriminate old classes. To mitigate this, we propose to transfer knowledge of negatives to the classifiers successively using an auxiliary classifier, boosting performance significantly. Experimental results on standard CISS benchmarks demonstrate the effectiveness of our framework.
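The abstract's key argument is that plain KD constrains only the summed class logit, so the two underlying terms can drift in opposite directions without any KD penalty, whereas DKD penalizes each term separately. The following minimal sketch illustrates that argument with a toy mean-squared distillation loss; the decomposition `z = z_pos + z_neg` and all names here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# Hypothetical decomposition: each class logit z = z_pos + z_neg, where
# z_pos scores evidence *for* the class and z_neg evidence *against* it.

def kd_loss(pos_new, neg_new, pos_old, neg_old):
    """Standard KD (toy MSE form): constrain only the summed logit."""
    z_new = pos_new + neg_new
    z_old = pos_old + neg_old
    return float(np.mean((z_new - z_old) ** 2))

def dkd_loss(pos_new, neg_new, pos_old, neg_old):
    """Decomposed KD (toy MSE form): constrain each term separately."""
    return float(np.mean((pos_new - pos_old) ** 2)
                 + np.mean((neg_new - neg_old) ** 2))

# Old model's two terms for three classes.
pos_old = np.array([2.0, -1.0, 0.5])
neg_old = np.array([-0.5, 1.5, -2.0])

# New model whose terms drifted in opposite directions, leaving sums unchanged.
shift = np.array([1.0, -2.0, 3.0])
pos_new, neg_new = pos_old + shift, neg_old - shift

print(kd_loss(pos_new, neg_new, pos_old, neg_old))   # 0.0: KD sees no change
print(dkd_loss(pos_new, neg_new, pos_old, neg_old))  # > 0: DKD detects the drift
```

Even with identical class logits (KD loss exactly zero), the per-term DKD loss is nonzero, which is the sense in which constraining each term "imitates the reasoning process" more faithfully than constraining their sum.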
Pages: 13
Related Papers
50 records in total
  • [41] Adaptively forget with crossmodal and textual distillation for class-incremental video captioning
    Xiong, Huiyu
    Wang, Lanxiao
    Qiu, Heqian
    Zhao, Taijin
    Qiu, Benliu
    Li, Hongliang
    NEUROCOMPUTING, 2025, 624
  • [42] WEAKLY-SUPERVISED CONTINUAL LEARNING FOR CLASS-INCREMENTAL SEGMENTATION
    Lenczner, Gaston
    Chan-Hon-Tong, Adrien
    Luminari, Nicola
    Le Saux, Bertrand
    2022 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS 2022), 2022, : 4843 - 4846
  • [43] MiSSNet: Memory-Inspired Semantic Segmentation Augmentation Network for Class-Incremental Learning in Remote Sensing Images
    Xie, Jiajun
    Pan, Bin
    Xu, Xia
    Shi, Zhenwei
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2024, 62 : 1 - 13
  • [44] Inherit With Distillation and Evolve With Contrast: Exploring Class Incremental Semantic Segmentation Without Exemplar Memory
    Zhao, Danpei
    Yuan, Bo
    Shi, Zhenwei
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (10) : 11932 - 11947
  • [45] Class-Incremental Learning for Semantic Segmentation Re-Using Neither Old Data Nor Old Labels
    Klingner, Marvin
    Baer, Andreas
    Donn, Philipp
    Fingscheidt, Tim
    2020 IEEE 23RD INTERNATIONAL CONFERENCE ON INTELLIGENT TRANSPORTATION SYSTEMS (ITSC), 2020,
  • [46] Uncertainty-Guided Semi-Supervised Few-Shot Class-Incremental Learning With Knowledge Distillation
    Cui, Yawen
    Deng, Wanxia
    Xu, Xin
    Liu, Zhen
    Liu, Zhong
    Pietikainen, Matti
    Liu, Li
    IEEE TRANSACTIONS ON MULTIMEDIA, 2023, 25 : 6422 - 6435
  • [47] Channel Affinity Knowledge Distillation for Semantic Segmentation
    Li, Huakun
    Zhang, Yuhang
    Tian, Shishun
    Cheng, Pengfei
    You, Rong
    Zou, Wenbin
    2023 IEEE 25TH INTERNATIONAL WORKSHOP ON MULTIMEDIA SIGNAL PROCESSING, MMSP, 2023,
  • [48] Endpoints Weight Fusion for Class Incremental Semantic Segmentation
    Xiao, Jia-Wen
    Zhang, Chang-Bin
    Feng, Jiekang
    Liu, Xialei
    van de Weijer, Joost
    Cheng, Ming-Ming
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR, 2023, : 7204 - 7213
  • [49] Dataset Knowledge Transfer for Class-Incremental Learning without Memory
    Slim, Habib
    Belouadah, Eden
    Popescu, Adrian
    Onchis, Darian
    2022 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV 2022), 2022, : 3311 - 3320
  • [50] Class-Incremental Unsupervised Domain Adaptation via Pseudo-Label Distillation
    Wei, Kun
    Yang, Xu
    Xu, Zhe
    Deng, Cheng
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2024, 33 : 1188 - 1198