Decomposed Knowledge Distillation for Class-Incremental Semantic Segmentation

Cited by: 0
Authors:
Baek, Donghyeon [1 ]
Oh, Youngmin [1 ]
Lee, Sanghoon [1 ]
Lee, Junghyup [1 ]
Ham, Bumsub [1 ,2 ]
Affiliations:
[1] Yonsei Univ, Seoul, South Korea
[2] Korea Inst Sci & Technol KIST, Seoul, South Korea
Keywords: (none listed)
DOI: not available
CLC classification: TP18 (Artificial Intelligence Theory)
Subject classification codes: 081104; 0812; 0835; 1405
Abstract:
Class-incremental semantic segmentation (CISS) labels each pixel of an image with a corresponding object/stuff class continually. To this end, it is crucial to learn novel classes incrementally without forgetting previously learned knowledge. Current CISS methods typically use a knowledge distillation (KD) technique to preserve classifier logits, or freeze a feature extractor, to avoid the forgetting problem. These strong constraints, however, prevent the model from learning discriminative features for novel classes. We introduce a CISS framework that alleviates the forgetting problem and facilitates learning novel classes effectively. We find that a logit can be decomposed into two terms that quantify how likely an input belongs, or does not belong, to a particular class, providing a clue to the reasoning process of a model. The KD technique, in this context, preserves only the sum of the two terms (i.e., a class logit), so each term can still change individually; KD thus does not imitate the reasoning process. To impose constraints on each term explicitly, we propose a new decomposed knowledge distillation (DKD) technique, improving the rigidity of a model and addressing the forgetting problem more effectively. We also introduce a novel initialization method to train new classifiers for novel classes. In CISS, the number of negative training samples available for novel classes is not sufficient to discriminate them from old classes. To mitigate this, we propose transferring knowledge of negatives to the classifiers successively using an auxiliary classifier, boosting performance significantly. Experimental results on standard CISS benchmarks demonstrate the effectiveness of our framework.
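The following is a minimal, illustrative sketch (not the authors' code) of the decomposed distillation idea described in the abstract. It assumes each class logit has already been split into two per-pixel terms whose sum gives the usual logit, and it contrasts a conventional logit-level KD loss, which constrains only the sum, with a decomposed loss that constrains each term separately. The function names, tensor shapes, and the choice of an L2 penalty are assumptions for illustration only.

```python
import torch
import torch.nn.functional as F

# Minimal sketch (not the authors' implementation). Assume each class logit
# has already been split into two per-pixel terms, `pos` and `neg`, whose
# sum gives the usual logit; how that split is obtained is outside this sketch.

def logit_kd_loss(pos_new, neg_new, pos_old, neg_old):
    """Conventional KD: only the summed class logit is matched to the old
    model, so the two terms may drift individually as long as their sum
    stays the same."""
    return F.mse_loss(pos_new + neg_new, pos_old + neg_old)

def decomposed_kd_loss(pos_new, neg_new, pos_old, neg_old):
    """Decomposed KD (illustrative): each term is matched to the old model
    separately, constraining the reasoning process rather than only its sum."""
    return F.mse_loss(pos_new, pos_old) + F.mse_loss(neg_new, neg_old)

if __name__ == "__main__":
    # Toy per-pixel terms: batch x classes x height x width.
    shape = (2, 5, 8, 8)
    pos_old, neg_old = torch.randn(shape), torch.randn(shape)
    pos_new, neg_new = torch.randn(shape), torch.randn(shape)
    print(logit_kd_loss(pos_new, neg_new, pos_old, neg_old).item())
    print(decomposed_kd_loss(pos_new, neg_new, pos_old, neg_old).item())
```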
Pages: 13