Class Similarity Weighted Knowledge Distillation for Continual Semantic Segmentation

Cited by: 26
Authors:
Minh Hieu Phan [1]
The-Anh Ta [2]
Son Lam Phung [1,3]
Long Tran-Thanh [4]
Abdesselam Bouzerdoum [1,5]
Affiliations:
[1] Univ Wollongong, Wollongong, NSW, Australia
[2] FPT Software, AIC, Hanoi, Vietnam
[3] VinAI Res, Hanoi, Vietnam
[4] Univ Warwick, Coventry, W Midlands, England
[5] Hamad Bin Khalifa Univ, Ar Rayyan, Qatar
Keywords:
DOI: 10.1109/CVPR52688.2022.01636
CLC number: TP18 [Theory of artificial intelligence]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Deep learning models are known to suffer from catastrophic forgetting when they incrementally learn new classes. Continual learning for semantic segmentation (CSS) is an emerging field in computer vision. We identify a problem in CSS: a model tends to confuse old and new classes that are visually similar, which makes it forget the old ones. To address this problem, we propose REMINDER, a new CSS framework, together with a novel class-similarity weighted knowledge distillation (CSW-KD) method. Our CSW-KD method distills the knowledge of the previous model on old classes that are similar to the new ones. This provides two main benefits: (i) selectively revising old classes that are more likely to be forgotten, and (ii) better learning of new classes by relating them to previously seen classes. Extensive experiments on the Pascal-VOC 2012 and ADE20k datasets show that our approach outperforms state-of-the-art methods on standard CSS settings by up to 7.07% and 8.49%, respectively.
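The abstract describes CSW-KD only at a high level. As a rough illustration of the idea, the PyTorch-style sketch below shows one way a per-old-class similarity weight could modulate a distillation loss against the frozen previous-step model. The function names (class_similarity_weights, csw_kd_loss), the prototype-based cosine similarity, and the temperature value are assumptions made for this sketch, not the paper's actual formulation.

```python
# Illustrative sketch only (assumed names and weighting scheme, not the authors' code).
import torch
import torch.nn.functional as F

def class_similarity_weights(old_prototypes, new_prototypes):
    """Weight each old class by its highest cosine similarity to any new class.

    old_prototypes: (C_old, D) feature prototypes of previously learned classes.
    new_prototypes: (C_new, D) feature prototypes of the classes being added.
    """
    old_n = F.normalize(old_prototypes, dim=1)
    new_n = F.normalize(new_prototypes, dim=1)
    sim = old_n @ new_n.t()                        # (C_old, C_new) cosine similarities
    return sim.max(dim=1).values.clamp(min=0.0)    # (C_old,) highest similarity per old class

def csw_kd_loss(student_logits, teacher_logits, weights, T=2.0):
    """KL distillation on old-class logits, re-weighted per old class.

    student_logits, teacher_logits: (B, C_old, H, W); the teacher is the frozen
    model from the previous learning step.
    weights: (C_old,) class-similarity weights from class_similarity_weights().
    """
    p_t = F.softmax(teacher_logits / T, dim=1)
    log_p_s = F.log_softmax(student_logits / T, dim=1)
    # Keep per-class KL terms separate so each old class can be re-weighted.
    kl = p_t * (torch.log(p_t.clamp_min(1e-8)) - log_p_s)   # (B, C_old, H, W)
    w = weights.view(1, -1, 1, 1).to(kl.device)
    return (w * kl).sum(dim=1).mean() * (T ** 2)
```

Under this reading, old classes that closely resemble the newly added ones receive a larger distillation weight, so the model revisits exactly the classes it is most likely to forget, which matches the two benefits claimed in the abstract.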
Pages: 16845-16854
Number of pages: 10
Related papers
50 records in total
  • [31] Semi-supervised Semantic Segmentation with Mutual Knowledge Distillation
    Yuan, Jianlong
    Ge, Jinchao
    Wang, Zhibin
    Liu, Yifan
    PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2023, 2023, : 5436 - 5444
  • [32] Knowledge Distillation for Efficient Panoptic Semantic Segmentation: applied to agriculture
    Li, Maohui
    Halstead, Michael
    McCool, Chris
    2023 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS, IROS, 2023, : 4204 - 4211
  • [33] FCKDNet: A Feature Condensation Knowledge Distillation Network for Semantic Segmentation
    Yuan, Wenhao
    Lu, Xiaoyan
    Zhang, Rongfen
    Liu, Yuhong
    ENTROPY, 2023, 25 (01)
  • [34] Semantic Segmentation of Medical Images Based on Knowledge Distillation Algorithm
    Liu, Hanqing
    Li, Fang
    Yang, Jingyi
    Wang, Xiaotian
    Han, Junling
    Wei, Jin
    Kang, Xiaodong
    12TH ASIAN-PACIFIC CONFERENCE ON MEDICAL AND BIOLOGICAL ENGINEERING, VOL 1, APCMBE 2023, 2024, 103 : 180 - 196
  • [35] Domain Adaptive Knowledge Distillation for Driving Scene Semantic Segmentation
    Kothandaraman, Divya
    Nambiar, Athira
    Mittal, Anurag
    2021 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION WORKSHOPS (WACVW 2021), 2021, : 134 - 143
  • [36] Label-Guided Knowledge Distillation for Continual Semantic Segmentation on 2D Images and 3D Point Clouds
    Yang, Ze
    Li, Ruibo
    Ling, Evan
    Zhang, Chi
    Wang, Yiming
    Huang, Dezhao
    Ma, Keng Teck
    Hur, Minhoe
    Lin, Guosheng
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2023), 2023, : 18555 - 18566
  • [37] Effects of Architectures on Continual Semantic Segmentation
    Kalb, Tobias
    Ahuja, Niket
    Zhou, Jingxing
    Beyerer, Juergen
    2023 IEEE INTELLIGENT VEHICLES SYMPOSIUM, IV, 2023,
  • [38] Open-World Semantic Segmentation Including Class Similarity
    Sodano, Matteo
    Magistri, Federico
    Nunes, Lucas
    Behley, Jens
    Stachniss, Cyrill
    2024 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2024, 2024, : 3184 - 3194
  • [39] Continual Learning With Knowledge Distillation: A Survey
    Li, Songze
    Su, Tonghua
    Zhang, Xuyao
    Wang, Zhongjie
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024,
  • [40] Preparing the Future for Continual Semantic Segmentation
    Lin, Zihan
    Wang, Zilei
    Zhang, Yixin
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2023), 2023, : 11876 - 11886