A lightweight crack segmentation network based on knowledge distillation

Cited by: 21
Authors
Wang, Wenjun [1]
Su, Chao [1]
Han, Guohui [1]
Zhang, Heng [1,2]
Affiliations
[1] Hohai Univ, Coll Water Conservancy & Hydropower Engn, Nanjing 210024, Peoples R China
[2] Changjiang Inst Survey Planning Design & Res, Wuhan 430010, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Crack segmentation; Deep learning; Knowledge distillation; Lightweight network; Channel-wise distillation;
DOI
10.1016/j.jobe.2023.107200
Chinese Library Classification (CLC)
TU [Building Science];
Discipline Classification Code
0813;
Abstract
This paper presents a novel approach for addressing the challenges of large parameter volumes and high computational complexity in existing deep learning models for crack detection. The method trains a student model under the guidance of a pretrained teacher model. Its novelty lies in the use of channel-wise knowledge distillation, which normalizes the activation maps of the teacher and student models and then minimizes the asymmetric Kullback-Leibler divergence between them to achieve optimal model performance. By focusing on imitating regions with prominent activation values, the student model achieves accurate crack localization. Test results show that the method improves crack segmentation, raising the F1-score and intersection over union by 2.17% and 3.55%, respectively, and that it outperforms the other knowledge distillation methods compared. This study establishes a lightweight crack segmentation model that balances accuracy and efficiency, providing a practical solution for crack segmentation in real-world scenarios.
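The channel-wise distillation described in the abstract can be illustrated with a short sketch. The PyTorch-style snippet below is an assumption for illustration only (the paper's exact layers, feature shapes, and temperature are not given here): each channel's activation map is normalized with a softmax over spatial positions, and the asymmetric Kullback-Leibler divergence KL(teacher || student) is minimized, so the student concentrates on the teacher's high-activation (crack) regions.

    # Minimal sketch of channel-wise knowledge distillation (assumed shapes/names).
    import torch
    import torch.nn.functional as F

    def channel_wise_distillation_loss(feat_t: torch.Tensor,
                                       feat_s: torch.Tensor,
                                       temperature: float = 4.0) -> torch.Tensor:
        """feat_t, feat_s: teacher/student activation maps of shape (N, C, H, W)."""
        n, c, h, w = feat_t.shape
        # Flatten spatial dimensions so each channel becomes a distribution over H*W locations.
        t = feat_t.reshape(n * c, h * w)
        s = feat_s.reshape(n * c, h * w)
        # Channel-wise normalization: softmax for the teacher, log-softmax for the student.
        p_t = F.softmax(t / temperature, dim=1)
        log_p_s = F.log_softmax(s / temperature, dim=1)
        # Asymmetric KL(teacher || student); regions with prominent teacher activations
        # dominate the loss, steering the student toward salient (crack) regions.
        return F.kl_div(log_p_s, p_t, reduction="batchmean") * (temperature ** 2)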
Pages: 19
Related Papers
50 records in total
  • [41] Qin, Dian; Bu, Jia-Jun; Liu, Zhe; Shen, Xin; Zhou, Sheng; Gu, Jing-Jun; Wang, Zhi-Hua; Wu, Lei; Dai, Hui-Fen. Efficient Medical Image Segmentation Based on Knowledge Distillation. IEEE TRANSACTIONS ON MEDICAL IMAGING, 2021, 40 (12): 3820-3831.
  • [42] Zhang, Chong-Yang; Wang, Bin. Lightweight remote sensing scene classification based on knowledge distillation. JOURNAL OF INFRARED AND MILLIMETER WAVES, 2024, 43 (05): 684-695.
  • [43] Zhao, Qianfei; Zhong, Lanfeng; Xiao, Jianghong; Zhang, Jingbo; Chen, Yinan; Liao, Wenjun; Zhang, Shaoting; Wang, Guotai. Efficient Multi-Organ Segmentation From 3D Abdominal CT Images With Lightweight Network and Knowledge Distillation. IEEE TRANSACTIONS ON MEDICAL IMAGING, 2023, 42 (09): 2513-2523.
  • [44] Chen, L.; Sun, Q.; Xu, Z.; Liao, Y. TGNet: A Lightweight Infrared Thermal Image Gesture Recognition Network Based on Knowledge Distillation and Model Pruning. 2024 CROSS STRAIT RADIO SCIENCE AND WIRELESS TECHNOLOGY CONFERENCE, CSRSWTC 2024, 2024: 96-98.
  • [45] Dang, Thai-Viet; Bui, Nhu-Nghia; Tan, Phan Xuan. KD-SegNet: Efficient Semantic Segmentation Network with Knowledge Distillation Based on Monocular Camera. CMC-COMPUTERS MATERIALS & CONTINUA, 2025, 82 (02): 2001-2026.
  • [46] Jeong, Yongseop; Park, Jinsun; Cho, Donghyeon; Hwang, Yoonjin; Choi, Seibum B.; Kweon, In So. Lightweight Depth Completion Network with Local Similarity-Preserving Knowledge Distillation. SENSORS, 2022, 22 (19).
  • [47] Wang, Xintong; Wang, Zixuan; Wang, Enliang; Sun, Zhixin. Spatial-temporal knowledge distillation for lightweight network traffic anomaly detection. COMPUTERS & SECURITY, 2024, 137.
  • [48] Xu, Hao; Cao, Guo; Deng, Lindiao; Ding, Lanwei; Xu, Ling; Pan, Qikun; Shang, Yanfeng. A Lightweight Convolution Network with Self-Knowledge Distillation for Hyperspectral Image Classification. FOURTEENTH INTERNATIONAL CONFERENCE ON GRAPHICS AND IMAGE PROCESSING, ICGIP 2022, 2022, 12705.
  • [49] Shang, Yingjun; Feng, Tao; Huo, Yonghua; Duan, Yongcun; Long, Yuhan. Lightweight Edge-side Fault Diagnosis Based on Knowledge Distillation. 2022 IEEE 14TH INTERNATIONAL CONFERENCE ON ADVANCED INFOCOMM TECHNOLOGY (ICAIT 2022), 2022: 348-353.
  • [50] Zhu, Anfu; Xie, Jiaxiao; Wang, Bin; Guo, Heng; Guo, Zilong; Wang, Jie; Xu, Lei; Zhu, Sixin; Yang, Zhanping. Lightweight defect detection algorithm of tunnel lining based on knowledge distillation. SCIENTIFIC REPORTS, 2024, 14 (01).