Asymmetrical Contrastive Learning Network via Knowledge Distillation for No-Service Rail Surface Defect Detection

Cited by: 1
Authors
Zhou, Wujie [1 ,2 ]
Sun, Xinyu [1 ]
Qian, Xiaohong [1 ]
Fang, Meixin [3 ]
Affiliations
[1] Zhejiang Univ Sci & Technol, Sch Informat & Elect Engn, Hangzhou 310023, Peoples R China
[2] Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore 308232, Singapore
[3] Zhejiang Univ, Sch Med, Hangzhou 310003, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Biological system modeling; Contrastive learning; Feature extraction; Adaptation models; Rails; Computational modeling; Neural networks; Defect detection; Decoding; Convolution; graph mapping distillation; knowledge distillation (KD); rail surface defect detection (SDD); SALIENT OBJECT DETECTION; REFINEMENT; FUSION;
DOI
10.1109/TNNLS.2024.3479453
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Owing to extensive research on deep learning, significant progress has recently been made in no-service rail surface defect detection (SDD). Nevertheless, existing algorithms face two main challenges. First, although depth features contain rich spatial structure information, most models accept only red-green-blue (RGB) features as input, which severely constrains performance. Thus, this study proposes a dual-stream teacher model, termed the asymmetrical contrastive learning network (ACLNet-T), that extracts both RGB and depth features to achieve high performance. Second, the dual-stream design substantially increases the number of parameters. As a solution, we designed a single-stream student model (ACLNet-S) that extracts only RGB features. Using knowledge distillation (KD) techniques, we leveraged a contrastive distillation loss to transfer rich multimodal features from ACLNet-T to ACLNet-S pixel by pixel and channel by channel. Furthermore, because the contrastive distillation loss focuses exclusively on local features, we employed multiscale graph mapping to establish long-range dependencies and transfer global features to ACLNet-S through a multiscale graph mapping distillation loss. Finally, an attentional distillation loss based on an adaptive attention decoder (AAD) was designed to further improve the performance of ACLNet-S. Consequently, we obtained ACLNet-S*, which achieves performance similar to that of ACLNet-T despite having nearly eight times fewer parameters. Through comprehensive experiments on the industrial RGB-D dataset NEU RSDDS-AUG, ACLNet-S* (ACLNet-S with KD) was confirmed to outperform 16 state-of-the-art methods. Moreover, to demonstrate its generalization capacity, ACLNet-S* was evaluated on three additional public datasets and achieved comparable results.
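The abstract describes two of the distillation signals at a high level: a contrastive distillation loss that aligns student and teacher features location by location, and a graph mapping distillation loss that matches pairwise affinity structure to transfer long-range dependencies. The paper's actual formulations are not reproduced in this record; the following is a minimal illustrative NumPy sketch of these two generic ideas, with all function names and shapes chosen here for illustration (they are assumptions, not the authors' implementation).

```python
import numpy as np

def contrastive_distill_loss(teacher, student, tau=0.1, eps=1e-8):
    """Illustrative pixel-wise contrastive distillation (InfoNCE-style).
    Each student pixel embedding is pulled toward the teacher embedding at
    the same location (positive) and pushed from all other locations
    (negatives). teacher, student: (C, H, W) feature maps."""
    C = teacher.shape[0]
    t = teacher.reshape(C, -1).T  # (HW, C) teacher pixel embeddings
    s = student.reshape(C, -1).T  # (HW, C) student pixel embeddings
    t = t / (np.linalg.norm(t, axis=1, keepdims=True) + eps)
    s = s / (np.linalg.norm(s, axis=1, keepdims=True) + eps)
    logits = s @ t.T / tau                         # (HW, HW) similarities
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # positives lie on the diagonal: matching spatial locations
    return float(-np.mean(np.diag(log_probs)))

def graph_mapping_distill_loss(teacher, student, eps=1e-8):
    """Illustrative graph mapping distillation: match pairwise pixel
    affinity (graph adjacency) matrices, which capture the long-range
    dependencies that a purely local loss misses."""
    def affinity(f):
        x = f.reshape(f.shape[0], -1).T  # (HW, C)
        x = x / (np.linalg.norm(x, axis=1, keepdims=True) + eps)
        return x @ x.T                   # (HW, HW) cosine-similarity graph
    diff = affinity(teacher) - affinity(student)
    return float(np.mean(diff ** 2))
```

As a sanity check, both losses should be small when the student features already match the teacher's and larger for unrelated features; a full method would sum such terms across scales and add the attention-based term the abstract mentions.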
Pages: 14
Related Papers
50 records
  • [21] Adaptive Cross Transformer With Contrastive Learning for Surface Defect Detection
    Huang, Xiaohua
    Li, Yang
    Bao, Yongqiang
    Zheng, Wenming
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2024, 73
  • [22] Knowledge Distillation for Single Image Super-Resolution via Contrastive Learning
    Liu, Cencen
    Zhang, Dongyang
    Qin, Ke
    PROCEEDINGS OF THE 4TH ANNUAL ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA RETRIEVAL, ICMR 2024, 2024, : 1079 - 1083
  • [23] Student Network Learning via Evolutionary Knowledge Distillation
    Zhang, Kangkai
    Zhang, Chunhui
    Li, Shikun
    Zeng, Dan
    Ge, Shiming
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2022, 32 (04) : 2251 - 2263
  • [24] A deep convolutional neural network for detection of rail surface defect
    Yuan, Hao
    Chen, Hao
    Liu, ShiWang
    Lin, Jun
    Luo, Xiao
    2019 IEEE VEHICLE POWER AND PROPULSION CONFERENCE (VPPC), 2019,
  • [25] An Improved Target Network Model for Rail Surface Defect Detection
    Zhang, Ye
    Feng, Tianshi
    Song, Yating
    Shi, Yuhang
    Cai, Guoqiang
    APPLIED SCIENCES-BASEL, 2024, 14 (15):
  • [26] Research on deep learning method for rail surface defect detection
    Feng, Jiang Hua
    Yuan, Hao
    Hu, Yun Qing
    Lin, Jun
    Liu, Shi Wang
    Luo, Xiao
    IET ELECTRICAL SYSTEMS IN TRANSPORTATION, 2020, 10 (04) : 436 - 442
  • [27] Surface Defect Detection Method of Lead Frame Based on Knowledge Distillation
    Li, Zhiwei
    Sun, Tingrui
    Du, Zhendong
    Hu, Xiangyang
    2024 4TH INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND INTELLIGENT SYSTEMS ENGINEERING, MLISE 2024, 2024, : 6 - 11
  • [29] SCL-IKD: intermediate knowledge distillation via supervised contrastive representation learning
    Sharma, Saurabh
    Lodhi, Shikhar Singh
    Chandra, Joydeep
    APPLIED INTELLIGENCE, 2023, 53 (23) : 28520 - 28541
  • [30] Cross-attention fusion and edge-guided fully supervised contrastive learning network for rail surface defect detection
    Yang, Jinxin
    Zhou, Wujie
    APPLIED INTELLIGENCE, 2025, 55 (6)