Asymmetrical Contrastive Learning Network via Knowledge Distillation for No-Service Rail Surface Defect Detection

Cited: 1
Authors
Zhou, Wujie [1 ,2 ]
Sun, Xinyu [1 ]
Qian, Xiaohong [1 ]
Fang, Meixin [3 ]
Affiliations
[1] Zhejiang Univ Sci & Technol, Sch Informat & Elect Engn, Hangzhou 310023, Peoples R China
[2] Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore 308232, Singapore
[3] Zhejiang Univ, Sch Med, Hangzhou 310003, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Biological system modeling; Contrastive learning; Feature extraction; Adaptation models; Rails; Computational modeling; Neural networks; Defect detection; Decoding; Convolution; graph mapping distillation; knowledge distillation (KD); rail surface defect detection (SDD); SALIENT OBJECT DETECTION; REFINEMENT; FUSION;
DOI
10.1109/TNNLS.2024.3479453
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Owing to extensive research on deep learning, significant progress has recently been made in no-service rail surface defect detection (SDD). Nevertheless, existing algorithms face two main challenges. First, although depth features contain rich spatial-structure information, most models accept only red-green-blue (RGB) features as input, which severely constrains performance. Thus, this study proposes a dual-stream teacher model, termed the asymmetrical contrastive learning network (ACLNet-T), which extracts both RGB and depth features to achieve high performance. Second, the dual-stream design substantially increases the number of parameters. As a solution, we designed a single-stream student model (ACLNet-S) that extracts only RGB features. We leveraged a contrastive distillation loss based on knowledge distillation (KD) techniques to transfer rich multimodal features from ACLNet-T to ACLNet-S pixel by pixel and channel by channel. Furthermore, because the contrastive distillation loss focuses exclusively on local features, we employed multiscale graph mapping to establish long-range dependencies and transfer global features to ACLNet-S through a multiscale graph mapping distillation loss. Finally, an attentional distillation loss based on an adaptive attention decoder (AAD) was designed to further improve the performance of ACLNet-S. Consequently, we obtained ACLNet-S* (ACLNet-S with KD), which achieves performance similar to that of ACLNet-T despite a nearly eightfold gap in parameter count. Comprehensive experiments on the industrial RGB-D dataset NEU RSDDS-AUG confirmed that ACLNet-S* outperforms 16 state-of-the-art methods. Moreover, to demonstrate its generalization capacity, the proposed network was evaluated on three additional public datasets, on which ACLNet-S* achieved comparable results.
Pages: 14
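
The abstract describes three distillation signals: a pixel-wise and channel-wise contrastive distillation loss, a multiscale graph mapping distillation loss, and an attentional distillation loss. To make the first of these more concrete, below is a minimal PyTorch sketch of an InfoNCE-style feature-distillation loss that aligns student (RGB-only) features with teacher (RGB-D) features per pixel and per channel. The function name, tensor shapes, and temperature are assumptions chosen for illustration; this is not the paper's exact formulation.

import torch
import torch.nn.functional as F

def pixelwise_channelwise_distill(f_student, f_teacher, tau=0.1):
    # Illustrative sketch, not the authors' published loss.
    # f_student, f_teacher: (B, C, H, W) feature maps of matching shape,
    # e.g. a student RGB feature and the corresponding teacher RGB-D feature.
    b, c, h, w = f_student.shape

    # Pixel-wise term: each spatial location is a C-dim vector; the student
    # vector is pulled toward the teacher vector at the same location and
    # pushed away from all other locations (InfoNCE with in-image negatives).
    s_pix = F.normalize(f_student.flatten(2).transpose(1, 2), dim=-1)  # (B, HW, C)
    t_pix = F.normalize(f_teacher.flatten(2).transpose(1, 2), dim=-1)  # (B, HW, C)
    logits_pix = torch.bmm(s_pix, t_pix.transpose(1, 2)) / tau         # (B, HW, HW)
    target_pix = torch.arange(h * w, device=f_student.device).repeat(b)
    loss_pix = F.cross_entropy(logits_pix.reshape(b * h * w, h * w), target_pix)

    # Channel-wise term: each channel is an HW-dim response map; the student
    # channel is aligned with the teacher channel of the same index.
    s_ch = F.normalize(f_student.flatten(2), dim=-1)                    # (B, C, HW)
    t_ch = F.normalize(f_teacher.flatten(2), dim=-1)                    # (B, C, HW)
    logits_ch = torch.bmm(s_ch, t_ch.transpose(1, 2)) / tau             # (B, C, C)
    target_ch = torch.arange(c, device=f_student.device).repeat(b)
    loss_ch = F.cross_entropy(logits_ch.reshape(b * c, c), target_ch)

    return loss_pix + loss_ch

if __name__ == "__main__":
    fs = torch.randn(2, 64, 16, 16)   # student (RGB-only branch) features
    ft = torch.randn(2, 64, 16, 16)   # teacher (RGB-D fused) features
    print(pixelwise_channelwise_distill(fs, ft).item())

In a teacher-student setup such as the one the abstract outlines, a loss of this kind would typically be applied at several feature stages and summed with the graph mapping and attentional distillation terms; the sketch shows only the core pixel/channel alignment step.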