Decomposed Knowledge Distillation for Class-Incremental Semantic Segmentation

Cited by: 0
Authors
Baek, Donghyeon [1 ]
Oh, Youngmin [1 ]
Lee, Sanghoon [1 ]
Lee, Junghyup [1 ]
Ham, Bumsub [1 ,2 ]
Affiliations
[1] Yonsei Univ, Seoul, South Korea
[2] Korea Inst Sci & Technol KIST, Seoul, South Korea
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Class-incremental semantic segmentation (CISS) labels each pixel of an image with a corresponding object/stuff class continually. To this end, it is crucial to learn novel classes incrementally without forgetting previously learned knowledge. Current CISS methods typically use a knowledge distillation (KD) technique to preserve classifier logits, or freeze a feature extractor, to avoid the forgetting problem. These strong constraints, however, prevent learning discriminative features for novel classes. We introduce a CISS framework that alleviates the forgetting problem and facilitates learning novel classes effectively. We have found that a logit can be decomposed into two terms. They quantify how likely an input belongs to a particular class or not, providing a clue to the reasoning process of a model. The KD technique, in this context, preserves only the sum of the two terms (i.e., a class logit), so each term can change individually; KD thus does not imitate the reasoning process. To impose constraints on each term explicitly, we propose a new decomposed knowledge distillation (DKD) technique, improving the rigidity of a model and addressing the forgetting problem more effectively. We also introduce a novel initialization method to train new classifiers for novel classes. In CISS, the number of negative training samples for novel classes is insufficient to discriminate them from old classes. To mitigate this, we propose to transfer knowledge of negatives to the classifiers successively using an auxiliary classifier, boosting the performance significantly. Experimental results on standard CISS benchmarks demonstrate the effectiveness of our framework.
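The core observation above can be illustrated with a toy sketch: if a logit is the sum of positive (evidence-for) and negative (evidence-against) contributions, a loss on the summed logit alone is blind to shifts that cancel out, while a decomposed loss is not. This is a minimal, hypothetical illustration in NumPy; the split of elementwise contributions by sign and the L1 losses here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def decompose_logit(w, f):
    """Split a class logit z = w @ f into two non-negative terms.

    The logit is a sum of elementwise contributions w_i * f_i; the positive
    part collects contributions raising the class score, the negative part
    those lowering it, so that z = z_pos - z_neg.  (Illustrative split by
    sign; the paper defines its own decomposition.)
    """
    contrib = w * f
    z_pos = contrib[contrib > 0].sum()
    z_neg = -contrib[contrib < 0].sum()
    return z_pos, z_neg

def kd_loss(w_old, w_new, f):
    """Standard KD on logits: penalizes change in the sum only."""
    return abs(w_old @ f - w_new @ f)

def dkd_loss(w_old, w_new, f):
    """Decomposed KD: penalizes change in each term separately."""
    po, no = decompose_logit(w_old, f)
    pn, nn = decompose_logit(w_new, f)
    return abs(po - pn) + abs(no - nn)

f = np.array([1.0, 1.0])
w_old = np.array([2.0, -1.0])   # z = 2 - 1 = 1  (pos=2, neg=1)
w_new = np.array([5.0, -4.0])   # z = 5 - 4 = 1  (pos=5, neg=4)

print(kd_loss(w_old, w_new, f))   # 0.0 -- the summed logit is unchanged
print(dkd_loss(w_old, w_new, f))  # 6.0 -- each term shifted by 3
```

The example shows why the plain KD constraint is loose: the old and new classifiers produce the same logit yet reason differently, and only the decomposed loss detects it.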
Pages: 13