A lightweight crack segmentation network based on knowledge distillation

Cited by: 21
Authors
Wang, Wenjun [1]
Su, Chao [1]
Han, Guohui [1]
Zhang, Heng [1,2]
Affiliations
[1] Hohai University, College of Water Conservancy and Hydropower Engineering, Nanjing 210024, China
[2] Changjiang Institute of Survey, Planning, Design and Research, Wuhan 430010, China
Source: Journal of Building Engineering
Funding
National Natural Science Foundation of China
Keywords
Crack segmentation; Deep learning; Knowledge distillation; Lightweight network; Channel-wise distillation
DOI
10.1016/j.jobe.2023.107200
CLC Classification Number
TU [Building Science]
Subject Classification Code
0813
Abstract
This paper presents a novel approach for addressing the challenges of large parameter volumes and high computational complexity in existing deep learning models for crack detection. The method trains a student model under the guidance of a pretrained teacher model. Its novelty lies in using channel-wise knowledge distillation to normalize the activation maps of the teacher and student models and then minimizing the asymmetric Kullback-Leibler divergence between them to achieve optimal model performance. By focusing on imitating regions with prominent activation values, the student model achieves accurate crack localization. Test results show that the method improves crack segmentation, raising the F1 score and intersection over union by 2.17% and 3.55%, respectively, and outperforms the other knowledge distillation methods compared. The resulting lightweight crack segmentation model balances accuracy and efficiency and provides a practical solution for crack segmentation in real-world scenarios.
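The abstract describes channel-wise knowledge distillation: each channel's activation map is normalized into a spatial probability distribution, and the student is trained to minimize the asymmetric KL divergence against the corresponding teacher channel. The sketch below illustrates such a loss in PyTorch under assumptions not taken from the paper; the temperature value, the T^2 scaling, and the loss weight lambda_kd are illustrative, not the authors' settings.

```python
# Minimal sketch of a channel-wise distillation loss, assuming a
# PyTorch-style implementation; hyperparameters are illustrative.
import torch
import torch.nn.functional as F


def channel_wise_distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL divergence between per-channel spatial distributions.

    student_logits, teacher_logits: activation maps of shape (N, C, H, W)
    from the student and teacher segmentation heads.
    """
    n, c = student_logits.shape[:2]
    # Flatten each channel's activation map and normalize it over the
    # spatial dimension with a temperature-softened softmax.
    student = F.log_softmax(student_logits.flatten(2) / temperature, dim=-1)
    teacher = F.softmax(teacher_logits.flatten(2) / temperature, dim=-1)
    # Asymmetric KL(teacher || student), averaged over batch and channels;
    # the T^2 factor keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(student, teacher, reduction="sum") * (temperature ** 2) / (n * c)


if __name__ == "__main__":
    # Toy check: batch of 2, single-channel crack maps at 64x64 resolution.
    s = torch.randn(2, 1, 64, 64)
    t = torch.randn(2, 1, 64, 64)
    print(channel_wise_distillation_loss(s, t).item())
    # In training, this would be combined with the ordinary segmentation loss,
    # e.g. total_loss = seg_loss + lambda_kd * channel_wise_distillation_loss(s, t).
```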
Pages: 19
Related Papers (50 total)
  • [31] Fan, Qian; Xie, Jun; Xia, Zhanghua; Tan, Yunhua; Zhu, Sanfan; Wang, Jiayu. A novel high-precision lightweight concrete bridge crack image recognition method based on knowledge distillation. Structural Health Monitoring - An International Journal, 2024.
  • [32] Chen, Wen; Gao, Liang; Li, Xinyu; Shen, Weiming. Lightweight convolutional neural network with knowledge distillation for cervical cells classification. Biomedical Signal Processing and Control, 2022, 71.
  • [33] Lu, Siyuan; Zeng, Weiliang; Li, Xueshi; Ou, Jiajun. Adaptive lightweight network construction method for Self-Knowledge Distillation. Neurocomputing, 2025, 624.
  • [34] Zhou, Xiaokang; Wu, Jiayi; Liang, Wei; Wang, Kevin I-Kai; Yan, Zheng; Yang, Laurence T.; Jin, Qun. Reconstructed Graph Neural Network With Knowledge Distillation for Lightweight Anomaly Detection. IEEE Transactions on Neural Networks and Learning Systems, 2024, 35 (09): 11817-11828.
  • [35] Yuan, Wenhao; Lu, Xiaoyan; Zhang, Rongfen; Liu, Yuhong. FCKDNet: A Feature Condensation Knowledge Distillation Network for Semantic Segmentation. Entropy, 2023, 25 (01).
  • [36] Zhao, Ruijie; Chen, Yu; Wang, Yijun; Shi, Yong; Xue, Zhi. An Efficient and Lightweight Approach for Intrusion Detection based on Knowledge Distillation. IEEE International Conference on Communications (ICC 2021), 2021.
  • [37] Zhu, Anfu; Wang, Bin; Xie, Jiaxiao; Ma, Congxiao. Lightweight Tunnel Defect Detection Algorithm Based on Knowledge Distillation. Electronics, 2023, 12 (15).
  • [38] Wang, Long-Hui; Dai, Qi; Du, Tony; Chen, Li-fang. Lightweight intrusion detection model based on CNN and knowledge distillation. Applied Soft Computing, 2024, 165.
  • [39] Zhi, Yongbo; Xi, Ning; Liu, Yuanqing; Hui, Honglei. A Lightweight Android Malware Detection Framework Based on Knowledge Distillation. Network and System Security, NSS 2021, 2021, 13041: 116-130.
  • [40] Tran, Le-Anh; Park, Dong-Chul. Lightweight image dehazing networks based on soft knowledge distillation. Visual Computer, 2024: 4047-4066.