Domain adaptation and knowledge distillation for lightweight pavement crack detection

Cited by: 0
Authors
Xiao, Tianhao [1 ]
Pang, Rong [3 ,4 ]
Liu, Huijun [1 ]
Yang, Chunhua [1 ]
Li, Ao [2 ]
Niu, Chenxu [1 ]
Ruan, Zhimin [5 ]
Xu, Ling [2 ]
Ge, Yongxin [2 ]
Affiliations
[1] Chongqing Univ, Coll Comp Sci, Chongqing 400044, Peoples R China
[2] Chongqing Univ, Sch Big Data & Software Engn, Chongqing 401331, Peoples R China
[3] Southwest Jiaotong Univ, Sch Comp & Artificial Intelligence, Chengdu 611756, Peoples R China
[4] China Merchants Chongqing Rd Engn Inspect Ctr Co L, Chongqing 400067, Peoples R China
[5] China Merchants Chongqing Commun Technol Res & Des, Chongqing 400067, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Pavement crack detection; Knowledge distillation; Lightweight model; Domain adaptation;
DOI
10.1016/j.eswa.2024.125734
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Pavement crack detection is crucial for maintaining safe driving conditions; thus, the timely and accurate detection of cracks is of considerable importance. However, although deep neural networks (DNNs) perform well in pavement crack detection, their dependence on large-scale labeled datasets, excessive model parameters, and high computational costs limit their deployment on edge or mobile devices. Conventional approaches concentrate on domain adaptation to leverage unlabeled data but overlook the domain shift issue, which can cause performance degradation that is especially noticeable in lightweight models. We therefore propose a lightweight deep domain-adaptive crack detection network (L-DDACDN) to address these issues. Specifically, we introduce a novel distillation loss that incorporates domain information, facilitating the transfer of knowledge from a teacher model to a student model. Additionally, L-DDACDN imitates the feature responses of the teacher model near object anchor locations, ensuring that the student model effectively learns crucial features; this addresses the domain shift issue and preserves performance in lightweight models. Experimental results show that, compared with the deep domain-adaptive crack detection network (DDACDN) trained with a large-scale pre-trained model, L-DDACDN loses on average only 3.5% in F1-score and 3.9% in Accuracy, while its model parameters and FLOPs are reduced by approximately 92%. Moreover, compared with YOLOv5 on the CQU-BPDD dataset, L-DDACDN improves F1-score and Accuracy by an average of 5% and 1.8%, respectively.
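The abstract names two mechanisms: a distillation loss that incorporates domain information, and imitation of the teacher's feature responses near object anchor locations. The sketch below shows, in minimal PyTorch, how such an anchor-masked, domain-weighted feature-imitation loss could be assembled. It is an illustrative sketch only: the function names, the binary anchor mask, and the use of a per-sample domain-discriminator probability as a weight are assumptions, not the authors' released implementation.

```python
# Hypothetical sketch of anchor-masked, domain-weighted feature imitation
# for detector distillation. Not the authors' code; names are illustrative.
import torch


def anchor_imitation_loss(student_feat, teacher_feat, anchor_mask):
    """MSE between student and teacher features, restricted to locations
    near object anchors.

    student_feat, teacher_feat: (N, C, H, W) backbone/neck feature maps
    anchor_mask: (N, 1, H, W) binary mask, 1 near positive anchors
    """
    diff = (student_feat - teacher_feat) ** 2   # per-element squared error
    masked = diff * anchor_mask                 # keep only anchor regions
    # Normalize by the number of active locations to keep the scale stable.
    return masked.sum() / anchor_mask.sum().clamp(min=1.0)


def domain_weighted_distillation(student_feat, teacher_feat, anchor_mask,
                                 domain_prob):
    """Weights the per-sample imitation error by a domain confidence score
    (e.g., a domain discriminator's output), so samples the teacher handles
    confidently in the target domain contribute more.

    domain_prob: (N,) values in [0, 1]
    """
    per_sample = ((student_feat - teacher_feat) ** 2 * anchor_mask).flatten(1).sum(1)
    denom = anchor_mask.flatten(1).sum(1).clamp(min=1.0)
    return (domain_prob * per_sample / denom).mean()


if __name__ == "__main__":
    n, c, h, w = 2, 64, 20, 20
    s = torch.randn(n, c, h, w, requires_grad=True)   # student features
    t = torch.randn(n, c, h, w)                       # teacher features (frozen)
    mask = (torch.rand(n, 1, h, w) > 0.8).float()     # toy anchor-vicinity mask
    dp = torch.rand(n)                                # toy domain confidences
    loss = anchor_imitation_loss(s, t, mask) + \
        domain_weighted_distillation(s, t, mask, dp)
    loss.backward()
    print(float(loss))
```

Restricting the squared error to anchor neighborhoods keeps the student from spending capacity matching background responses, which is the usual motivation for anchor-based feature imitation in detector distillation; the domain weighting is one plausible way to fold "domain information" into the loss as the abstract describes.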
Pages: 13
Related papers
50 records in total
  • [21] Iterative knowledge distillation and pruning for model compression in unsupervised domain adaptation
    Wang, Zhiyuan
    Shi, Long
    Mei, Zhen
    Zhao, Xiang
    Wang, Zhe
    Li, Jun
    PATTERN RECOGNITION, 2025, 164
  • [22] Lightweight defect detection algorithm of tunnel lining based on knowledge distillation
    Zhu, Anfu
    Xie, Jiaxiao
    Wang, Bin
    Guo, Heng
    Guo, Zilong
    Wang, Jie
    Xu, Lei
    Zhu, Sixin
    Yang, Zhanping
    SCIENTIFIC REPORTS, 2024, 14 (01):
  • [23] A Lightweight Pipeline Edge Detection Model Based on Heterogeneous Knowledge Distillation
    Zhu, Chengyuan
    Pu, Yanyun
    Lyu, Zhuoling
    Wu, Aonan
    Yang, Kaixiang
    Yang, Qinmin
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS, 2024, 71 (12) : 5059 - 5063
  • [24] A Federated Domain Adaptation Algorithm Based on Knowledge Distillation and Contrastive Learning
    Huang, Fang
    Fang, Zhijun
    Shi, Zhicai
    Zhuang, Lehui
    Li, Xingchen
    Huang, Bo
    WUHAN UNIVERSITY JOURNAL OF NATURAL SCIENCES, 2022, 27 (06) : 499 - 507
  • [25] Research on a lightweight electronic component detection method based on knowledge distillation
    Xia, Zilin
    Gu, Jinan
    Wang, Wenbo
    Huang, Zedong
    MATHEMATICAL BIOSCIENCES AND ENGINEERING, 2023, 20 (12) : 20971 - 20994
  • [26] KDSMALL: A lightweight small object detection algorithm based on knowledge distillation
    Zhou, Wen
    Wang, Xiaodong
    Fan, Yusheng
    Yang, Yishuai
    Wen, Yihan
    Li, Yixuan
    Xu, Yicheng
    Lin, Zhengyuan
    Chen, Langlang
    Yao, Shizhou
    Liu, Zequn
    Wang, Jianqing
    COMPUTER COMMUNICATIONS, 2024, 219 : 271 - 281
  • [27] Reconstructed Graph Neural Network With Knowledge Distillation for Lightweight Anomaly Detection
    Zhou, Xiaokang
    Wu, Jiayi
    Liang, Wei
    Wang, Kevin I-Kai
    Yan, Zheng
    Yang, Laurence T.
    Jin, Qun
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (09) : 11817 - 11828
  • [28] CXR Segmentation by AdaIN-Based Domain Adaptation and Knowledge Distillation
    Oh, Yujin
    Ye, Jong Chul
    COMPUTER VISION, ECCV 2022, PT XXI, 2022, 13681 : 627 - 643
  • [29] Unsupervised Multi-Target Domain Adaptation Through Knowledge Distillation
    Nguyen-Meidine, L. T.
    Belal, A.
    Kiran, M.
    Dolz, J.
    Blais-Morin, L-A
    Granger, E.
    2021 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV 2021), 2021, : 1338 - 1346
  • [30] Knowledge Distillation Facilitates the Lightweight and Efficient Plant Diseases Detection Model
    Huang, Qianding
    Wu, Xingcai
    Wang, Qi
    Dong, Xinyu
    Qin, Yongbin
    Wu, Xue
    Gao, Yangyang
    Hao, Gefei
    PLANT PHENOMICS, 2023, 5