Class Similarity Weighted Knowledge Distillation for Continual Semantic Segmentation

Citations: 26
Authors:
Minh Hieu Phan [1]
The-Anh Ta [2]
Son Lam Phung [1,3]
Long Tran-Thanh [4]
Abdesselam Bouzerdoum [1,5]
Affiliations:
[1] University of Wollongong, Wollongong, NSW, Australia
[2] FPT Software, AIC, Hanoi, Vietnam
[3] VinAI Research, Hanoi, Vietnam
[4] University of Warwick, Coventry, West Midlands, England
[5] Hamad Bin Khalifa University, Ar Rayyan, Qatar
DOI: 10.1109/CVPR52688.2022.01636
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline Codes: 081104; 0812; 0835; 1405
Abstract:
Deep learning models are known to suffer from catastrophic forgetting when they incrementally learn new classes. Continual learning for semantic segmentation (CSS) is an emerging field in computer vision. We identify a problem in CSS: a model tends to confuse old and new classes that are visually similar, which causes it to forget the old ones. To address this problem, we propose REMINDER, a new CSS framework, and a novel class-similarity weighted knowledge distillation (CSW-KD) method. Our CSW-KD method distills the knowledge of the previous model on old classes that are similar to the new ones. This provides two main benefits: (i) selectively revising the old classes that are most likely to be forgotten, and (ii) learning new classes better by relating them to previously seen classes. Extensive experiments on the Pascal-VOC 2012 and ADE20k datasets show that our approach outperforms state-of-the-art methods on standard CSS settings by up to 7.07% and 8.49%, respectively.
Pages: 16845-16854 (10 pages)
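
To make the distillation idea in the abstract concrete, below is a minimal PyTorch sketch of one plausible form of a class-similarity weighted distillation loss. The function names (`class_similarity`, `csw_kd_loss`), the cosine similarity between classifier weight vectors, and the max-over-new-classes weighting are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def class_similarity(old_weights, new_weights):
    """Similarity of each old class to the new classes.

    old_weights: [C_old, D] classifier vectors of the previous-step model.
    new_weights: [C_new, D] classifier vectors for the newly added classes.
    Cosine similarity between classifier weights is an assumption here;
    the paper may derive the similarity scores differently.
    """
    sim = F.cosine_similarity(old_weights.unsqueeze(1),
                              new_weights.unsqueeze(0), dim=-1)  # [C_old, C_new]
    # Weight each old class by its closest new class, clipped to be non-negative.
    return sim.max(dim=1).values.clamp(min=0.0)

def csw_kd_loss(old_logits, new_logits, class_sim):
    """Class-similarity weighted distillation over old-class logits.

    old_logits: [B, C_old, H, W] from the frozen previous-step model.
    new_logits: [B, C_old, H, W] from the current model, restricted to old classes.
    class_sim:  [C_old] per-class weights from class_similarity above.
    """
    old_prob = F.softmax(old_logits, dim=1)   # per-pixel soft targets
    log_new = F.log_softmax(new_logits, dim=1)
    per_class = -(old_prob * log_new)         # [B, C_old, H, W] KD cross-entropy
    # Up-weight old classes similar to the new ones, so the classes most at
    # risk of being confused with them (and forgotten) are revised hardest.
    w = class_sim.view(1, -1, 1, 1)
    return (w * per_class).sum(dim=1).mean()
```

In this reading, the weighting realizes both benefits listed in the abstract: old classes that resemble a new class receive the strongest distillation signal (selective revision), and the similarity scores explicitly relate new classes to the previously seen classes they are most likely to be confused with.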