FCKDNet: A Feature Condensation Knowledge Distillation Network for Semantic Segmentation

Cited by: 2
Authors:
Yuan, Wenhao [1 ]
Lu, Xiaoyan [1 ]
Zhang, Rongfen [1 ]
Liu, Yuhong [1 ]
Affiliations:
[1] Guizhou Univ, Coll Big Data & Informat Engn, Guiyang 550025, Peoples R China
Keywords:
knowledge distillation; feature condensation; prediction information entropy; feature soft enhancement; semantic segmentation; images
DOI:
10.3390/e25010125
Chinese Library Classification (CLC): O4 [Physics]
Discipline code: 0702
Abstract:
As a popular research topic in computer vision, knowledge distillation (KD) is widely used in semantic segmentation (SS). However, under the teacher-student learning paradigm, the poor quality of the teacher network's feature knowledge still hinders the development of KD techniques. In this paper, we investigate the output features of the teacher-student network and propose a feature condensation-based KD network (FCKDNet), which reduces pseudo-knowledge transfer between the teacher and student networks. First, based on a pixel-wise information entropy rule, we design a feature condensation method that separates the foreground feature knowledge in the teacher network's outputs from the background noise. The resulting feature condensation matrix is then applied to the original outputs of both the teacher and student networks to improve their feature representation capability. In addition, after condensing the teacher features, we propose a soft feature enhancement method over the spatial and channel dimensions to strengthen pixel dependencies within the feature maps. Finally, we split the teacher network's outputs into spatial condensation features and channel condensation features and compute a separate distillation loss against the student network for each, which helps the student network converge faster. Extensive experiments on the public Pascal VOC and Cityscapes datasets demonstrate that our method improves the baseline by 3.16% and 2.98% in mAcc, and by 2.03% and 2.30% in mIoU, respectively, and achieves better segmentation performance and robustness than mainstream methods.
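To make the condensation step concrete, the following is a minimal PyTorch sketch of the entropy-based masking and masked distillation loss described in the abstract. The function names, the per-image mean-entropy threshold, and the masked KL term are illustrative assumptions for exposition, not the authors' exact formulation.

    import torch
    import torch.nn.functional as F

    def condensation_mask(teacher_logits: torch.Tensor) -> torch.Tensor:
        # Pixel-wise prediction entropy of the teacher's softmax output.
        probs = F.softmax(teacher_logits, dim=1)                    # (B, C, H, W)
        entropy = -(probs * probs.clamp_min(1e-8).log()).sum(1, keepdim=True)
        # Hypothetical thresholding rule: keep pixels whose entropy falls
        # below the per-image mean, treating confident (low-entropy)
        # pixels as foreground knowledge and the rest as background noise.
        threshold = entropy.mean(dim=(2, 3), keepdim=True)
        return (entropy < threshold).float()                        # (B, 1, H, W)

    def condensed_kd_loss(teacher_logits: torch.Tensor,
                          student_logits: torch.Tensor,
                          tau: float = 4.0) -> torch.Tensor:
        # Apply the condensation mask to both networks' outputs and
        # distill only over the retained (foreground) region.
        mask = condensation_mask(teacher_logits)
        t = F.softmax(teacher_logits / tau, dim=1)
        s = F.log_softmax(student_logits / tau, dim=1)
        kl = F.kl_div(s, t, reduction="none").sum(1, keepdim=True)  # (B, 1, H, W)
        return (kl * mask).sum() / mask.sum().clamp_min(1.0) * tau ** 2

In use, condensed_kd_loss(teacher_logits, student_logits) would stand in for a plain full-map KL distillation term in the training objective, so that gradients flow only through pixels on which the teacher is confident.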
Pages: 16