Decomposed Knowledge Distillation for Class-Incremental Semantic Segmentation

Cited by: 0
Authors
Baek, Donghyeon [1]
Oh, Youngmin [1]
Lee, Sanghoon [1]
Lee, Junghyup [1]
Ham, Bumsub [1,2]
Affiliations
[1] Yonsei Univ, Seoul, South Korea
[2] Korea Inst Sci & Technol KIST, Seoul, South Korea
Keywords
DOI
Not available
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Class-incremental semantic segmentation (CISS) labels each pixel of an image with a corresponding object/stuff class continually. To this end, it is crucial to learn novel classes incrementally without forgetting previously learned knowledge. Current CISS methods typically avoid the forgetting problem by using a knowledge distillation (KD) technique that preserves classifier logits, or by freezing the feature extractor. These strong constraints, however, prevent learning discriminative features for novel classes. We introduce a CISS framework that alleviates the forgetting problem and facilitates learning novel classes effectively. We have found that a logit can be decomposed into two terms that quantify how likely an input is, or is not, to belong to a particular class, providing a clue to the reasoning process of a model. In this context, KD preserves only the sum of the two terms (i.e., the class logit), so each term can still change freely, and KD therefore does not imitate the reasoning process. To impose constraints on each term explicitly, we propose a new decomposed knowledge distillation (DKD) technique, which improves the rigidity of a model and addresses the forgetting problem more effectively. We also introduce a novel initialization method to train new classifiers for novel classes. In CISS, the number of negative training samples available for novel classes is not sufficient to discriminate them from old classes. To mitigate this, we propose to transfer knowledge of negatives to the new classifiers successively using an auxiliary classifier, boosting performance significantly. Experimental results on standard CISS benchmarks demonstrate the effectiveness of our framework.
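The contrast between standard KD and the decomposed variant described in the abstract can be sketched in a few lines of PyTorch. This is a minimal illustration, not the authors' implementation: it assumes the two terms of a class logit are obtained by accumulating the positive and negative channel-wise contributions of the classifier weights (an illustrative decomposition), and it uses hypothetical tensor shapes and a simple MSE distillation loss over the old classes.

```python
import torch
import torch.nn.functional as F

def decompose_logits(features, weight, bias):
    # features: (B, D, H, W), weight: (C, D), bias: (C,)
    # Channel-wise contributions of each feature dimension to each class logit.
    contrib = weight.view(1, *weight.shape, 1, 1) * features.unsqueeze(1)  # (B, C, D, H, W)
    pos = contrib.clamp(min=0).sum(dim=2)      # positive-evidence term, (B, C, H, W)
    neg = (-contrib).clamp(min=0).sum(dim=2)   # negative-evidence term, (B, C, H, W)
    logits = pos - neg + bias.view(1, -1, 1, 1)  # pos - neg + bias recovers the usual logits
    return logits, pos, neg

def kd_loss(feat_s, w_s, b_s, feat_t, w_t, b_t):
    # Standard KD: match only the summed logits of the old classes.
    z_s, _, _ = decompose_logits(feat_s, w_s, b_s)
    with torch.no_grad():
        z_t, _, _ = decompose_logits(feat_t, w_t, b_t)
    return F.mse_loss(z_s[:, : z_t.size(1)], z_t)

def dkd_loss(feat_s, w_s, b_s, feat_t, w_t, b_t):
    # Decomposed KD: constrain the two evidence terms of each old-class logit separately.
    _, pos_s, neg_s = decompose_logits(feat_s, w_s, b_s)
    with torch.no_grad():
        _, pos_t, neg_t = decompose_logits(feat_t, w_t, b_t)
    n_old = pos_t.size(1)  # the current head also holds classifiers for novel classes
    return F.mse_loss(pos_s[:, :n_old], pos_t) + F.mse_loss(neg_s[:, :n_old], neg_t)

# Toy usage with random tensors (hypothetical shapes).
B, D, H, W, n_old, n_total = 2, 16, 8, 8, 5, 7
feat_t, w_t, b_t = torch.randn(B, D, H, W), torch.randn(n_old, D), torch.zeros(n_old)
feat_s = torch.randn(B, D, H, W, requires_grad=True)
w_s = torch.randn(n_total, D, requires_grad=True)
b_s = torch.zeros(n_total, requires_grad=True)
loss = dkd_loss(feat_s, w_s, b_s, feat_t, w_t, b_t)
loss.backward()
```

Matching both terms implies matching their difference, so this decomposed loss is at least as constraining as matching the summed logit alone; the converse does not hold, which is the gap the abstract points to.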
Pages: 13