Class Similarity Weighted Knowledge Distillation for Continual Semantic Segmentation

Cited by: 26
Authors
Minh Hieu Phan [1]
The-Anh Ta [2]
Son Lam Phung [1,3]
Long Tran-Thanh [4]
Abdesselam Bouzerdoum [1,5]
Affiliations
[1] Univ Wollongong, Wollongong, NSW, Australia
[2] FPT Software, AIC, Hanoi, Vietnam
[3] VinAI Res, Hanoi, Vietnam
[4] Univ Warwick, Coventry, W Midlands, England
[5] Hamad Bin Khalifa Univ, Ar Rayyan, Qatar
DOI
10.1109/CVPR52688.2022.01636
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405
Abstract
Deep learning models are known to suffer from catastrophic forgetting when they incrementally learn new classes. Continual learning for semantic segmentation (CSS) is an emerging field in computer vision. We identify a problem in CSS: a model tends to confuse old and new classes that are visually similar, which makes it forget the old ones. To address this gap, we propose REMINDER, a new CSS framework, together with a novel class-similarity weighted knowledge distillation (CSW-KD) method. CSW-KD distills the knowledge of the previous model on old classes that are similar to the new ones. This provides two main benefits: (i) selectively revising old classes that are more likely to be forgotten, and (ii) better learning new classes by relating them to previously seen classes. Extensive experiments on the Pascal-VOC 2012 and ADE20k datasets show that our approach outperforms state-of-the-art methods on standard CSS settings by up to 7.07% and 8.49%, respectively.
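To illustrate the idea described in the abstract, below is a minimal PyTorch-style sketch of a distillation loss whose per-class contribution is weighted by how similar each old class is to the new ones. The similarity measure (cosine similarity between classifier weight vectors), the weighting scheme, and all function and variable names are assumptions made for illustration; they are not the authors' released implementation.

```python
import torch
import torch.nn.functional as F

def csw_kd_loss(new_logits, old_logits, w_old, w_new, T=2.0):
    """Sketch of a class-similarity weighted knowledge distillation loss.

    new_logits: current model's logits on old classes, shape (B, C_old, H, W)
    old_logits: frozen previous model's logits,        shape (B, C_old, H, W)
    w_old:      classifier weights of old classes,     shape (C_old, D)
    w_new:      classifier weights of new classes,     shape (C_new, D)
    """
    # Cosine similarity between each old class and every new class, used here
    # as a proxy for visual similarity: an old class that resembles some new
    # class receives a larger weight, so its knowledge is distilled more strongly.
    sim = F.cosine_similarity(
        w_old.unsqueeze(1), w_new.unsqueeze(0), dim=-1)         # (C_old, C_new)
    class_weight = torch.softmax(sim.max(dim=1).values, dim=0)   # (C_old,)

    # Temperature-scaled KL divergence between old and new predictions,
    # averaged over batch and spatial dimensions, kept per old class.
    p_old = F.softmax(old_logits / T, dim=1)
    log_p_new = F.log_softmax(new_logits / T, dim=1)
    kl_per_class = (p_old * (p_old.clamp_min(1e-8).log() - log_p_new)).mean(dim=(0, 2, 3))

    # Weighted sum over old classes; T*T keeps gradient magnitudes comparable.
    return (class_weight * kl_per_class).sum() * T * T
```

In this sketch the weight of each old class comes from its maximum similarity to any new class, so the classes most likely to be confused with the new ones dominate the distillation term; the actual method may define both the similarity measure and the weighting differently.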
Pages: 16845-16854
Page count: 10
Related papers
50 records in total
  • [21] TIKP: Text-to-Image Knowledge Preservation for Continual Semantic Segmentation. Yu, Zhidong; Yang, Wei; Xie, Xike; Shi, Zhenbo. Thirty-Eighth AAAI Conference on Artificial Intelligence (AAAI), Vol. 38, No. 15, 2024: 16596-16604.
  • [22] Multi-instance semantic similarity transferring for knowledge distillation. Zhao, Haoran; Sun, Xin; Dong, Junyu; Yu, Hui; Wang, Gaige. Knowledge-Based Systems, 2022, 256.
  • [23] Bilateral Knowledge Distillation for Unsupervised Domain Adaptation of Semantic Segmentation. Wang, Yunnan; Li, Jianxun. 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2022: 10177-10184.
  • [24] Multi-view knowledge distillation for efficient semantic segmentation. Wang, Chen; Zhong, Jiang; Dai, Qizhu; Qi, Yafei; Shi, Fengyuan; Fang, Bin; Li, Xue. Journal of Real-Time Image Processing, 2023, 20 (02).
  • [25] Robust Semantic Segmentation With Multi-Teacher Knowledge Distillation. Amirkhani, Abdollah; Khosravian, Amir; Masih-Tehrani, Masoud; Kashiani, Hossein. IEEE Access, 2021, 9: 119049-119066.
  • [27] Cross-Image Relational Knowledge Distillation for Semantic Segmentation. Yang, Chuanguang; Zhou, Helong; An, Zhulin; Jiang, Xue; Xu, Yongjun; Zhang, Qian. 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022: 12309-12318.
  • [28] Channel-spatial knowledge distillation for efficient semantic segmentation. Karine, Ayoub; Napoleon, Thibault; Jridi, Maher. Pattern Recognition Letters, 2024, 180: 48-54.
  • [29] Point-to-Voxel Knowledge Distillation for LiDAR Semantic Segmentation. Hou, Yuenan; Zhu, Xinge; Ma, Yuexin; Loy, Chen Change; Li, Yikang. 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022: 8469-8478.
  • [30] Inter-image Discrepancy Knowledge Distillation for Semantic Segmentation. Chen, Kaijie; Gou, Jianping; Li, Lin. Pattern Recognition and Computer Vision, PRCV 2023, Part III, 2024, 14427: 273-284.