Abnormality Detection of Blast Furnace Tuyere Based on Knowledge Distillation and a Vision Transformer

Cited by: 2
Authors
Song, Chuanwang [1 ]
Zhang, Hao [1 ]
Wang, Yuanjun [1 ]
Wang, Yuhui [1 ]
Hu, Keyong [1 ]
Affiliations
[1] Qingdao Univ Technol, Sch Informat & Control Engn, Qingdao 266520, Peoples R China
Source
APPLIED SCIENCES-BASEL | 2023, Vol. 13, Issue 18
Keywords
transformer; CNN; knowledge distillation; self-attention mechanism; image classification
DOI
10.3390/app131810398
Chinese Library Classification (CLC)
O6 [Chemistry]
Discipline Code
0703
Abstract
The blast furnace tuyere is a key observation point in hot metal production and is primarily monitored to assess the internal state of the furnace. However, detecting abnormal tuyere conditions has relied heavily on manual judgment, which has clear limitations. To address this issue, we propose a tuyere abnormality detection model based on knowledge distillation and a vision transformer (ViT), in which ResNet-50 serves as the teacher model and distills knowledge into the ViT student model. First, we introduce spatial attention modules to enhance the model's perception and feature-extraction capabilities across different image regions. Furthermore, we reduce the depth of the ViT and improve its self-attention mechanism to lower the training loss. In addition, we apply a knowledge distillation strategy to obtain a lightweight model and enhance its generalization capability. Finally, we evaluate the model's performance in tuyere abnormality detection and compare it with other classification methods such as VGG-19, ResNet-101, and ResNet-50. Experimental results show that our model achieves a classification accuracy of 97.86% on a tuyere image dataset from a company, surpassing the original ViT model by 1.12% and the improved ViT model without knowledge distillation by 0.34%. Meanwhile, the model achieves competitive classification accuracies of 90.31% and 77.65% on the classical fine-grained image datasets Stanford Dogs and CUB-200-2011, respectively, comparable to other classification models.
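To make the teacher-student setup described in the abstract concrete, the sketch below pairs a ResNet-50 teacher with a ViT student under the standard soft-target distillation loss. This is a minimal illustration, not the authors' released code: the class count, temperature, loss weighting, and the timm model identifier are assumptions, and the paper's spatial attention modules and modified self-attention mechanism are not reproduced.

```python
# Minimal knowledge-distillation sketch: ResNet-50 teacher -> ViT student.
# Hyperparameters and model names are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision.models as models
import timm  # assumed available for a ViT backbone

NUM_CLASSES = 2    # e.g., normal vs. abnormal tuyere images (assumed)
TEMPERATURE = 4.0  # softening temperature for distillation (assumed)
ALPHA = 0.5        # weight between hard-label and distillation terms (assumed)

# Teacher: ResNet-50, typically pre-trained and fine-tuned, then frozen.
teacher = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
teacher.fc = nn.Linear(teacher.fc.in_features, NUM_CLASSES)
teacher.eval()

# Student: a ViT; the paper's reduced depth and improved self-attention are
# not reproduced here, so a stock small ViT stands in for the student.
student = timm.create_model("vit_small_patch16_224", pretrained=False,
                            num_classes=NUM_CLASSES)

def distillation_loss(student_logits, teacher_logits, labels,
                      T=TEMPERATURE, alpha=ALPHA):
    """Soft-target KD loss: cross-entropy on ground-truth labels plus
    KL divergence between temperature-softened teacher/student outputs."""
    hard = F.cross_entropy(student_logits, labels)
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * (T * T)
    return alpha * hard + (1.0 - alpha) * soft

# One illustrative training step on a dummy batch of 224x224 RGB images.
images = torch.randn(4, 3, 224, 224)
labels = torch.randint(0, NUM_CLASSES, (4,))
with torch.no_grad():
    t_logits = teacher(images)  # teacher provides soft targets only
s_logits = student(images)
loss = distillation_loss(s_logits, t_logits, labels)
loss.backward()
```

In this formulation the teacher contributes only softened logits, so the student can be made shallower and lighter than the teacher while still inheriting its decision boundaries, which matches the abstract's goal of model lightweighting with preserved generalization.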
Pages: 15
Related Papers
50 records in total (items 21-30 shown)
  • [21] Fire Detection Approach Based on Vision Transformer
    Khudayberdiev, Otabek
    Zhang, Jiashu
    Elkhalil, Ahmed
    Balde, Lansana
    ARTIFICIAL INTELLIGENCE AND SECURITY, ICAIS 2022, PT I, 2022, 13338 : 41 - 53
  • [22] Lightweight Underwater Target Detection Algorithm Based on Dynamic Sampling Transformer and Knowledge-Distillation Optimization
    Chen, Liang
    Yang, Yuyi
    Wang, Zhenheng
    Zhang, Jian
    Zhou, Shaowu
    Wu, Lianghong
    JOURNAL OF MARINE SCIENCE AND ENGINEERING, 2023, 11 (02)
  • [23] Computer Vision Technology Based on Sensor Data and Hybrid Deep Learning for Security Detection of Blast Furnace Bearing
    Yang, Ai-Min
    Zhi, Jian-Ming
    Yang, Ke
    Wang, Jia-Hao
    Xue, Tao
    IEEE SENSORS JOURNAL, 2021, 21 (22) : 24982 - 24992
  • [24] Hydrophobicity-Based Grading of Industrial Composite Insulators Images Using Cross Attention Vision Transformer With Knowledge Distillation
    Das, Samiran
    Chatterjee, Sujoy
    Basu, Mainak
    IEEE TRANSACTIONS ON DIELECTRICS AND ELECTRICAL INSULATION, 2024, 31 (01) : 523 - 532
  • [25] Diameter Selection of Blast Furnace Tuyeres Based on the Rate and Energy of the Fuel-Enriched Blast and Tuyere Gas Flows with the Injection of the Pulverized Coal
    Lyalyuk, V. P.
    Steel in Translation, 2021, 51 (09) : 627 - 639
  • [26] KDFAS: Multi-stage Knowledge Distillation Vision Transformer for Face Anti-spoofing
    Zhang, Jun
    Zhang, Yunfei
    Shao, Feixue
    Ma, Xuetao
    Zhou, Daoxiang
    PATTERN RECOGNITION AND COMPUTER VISION, PRCV 2023, PT V, 2024, 14429 : 159 - 171
  • [27] Prediction method of furnace temperature based on transfer learning and knowledge distillation
    Zhai, N.
    Zhou, X. F.
    Li, S.
    Shi, H.
    Jisuanji Jicheng Zhizao Xitong/Computer Integrated Manufacturing Systems, CIMS, 2022, 28 (06): : 1860 - 1869
  • [28] KDViT: COVID-19 diagnosis on CT-scans with knowledge distillation of vision transformer
    Lim, Yu Jie
    Lim, Kian Ming
    Chang, Roy Kwang Yang
    Lee, Chin Poo
    AUTOMATIKA, 2024, 65 (03) : 1113 - 1126
  • [29] Defect Detection Method Based on Knowledge Distillation
    Zhou, Qunying
    Wang, Hongyuan
    Tang, Ying
    Wang, Yang
    IEEE ACCESS, 2023, 11 : 35866 - 35873
  • [30] A Transformer-Based Knowledge Distillation Network for Cortical Cataract Grading
    Wang, Jinhong
    Xu, Zhe
    Zheng, Wenhao
    Ying, Haochao
    Chen, Tingting
    Liu, Zuozhu
    Chen, Danny Z.
    Yao, Ke
    Wu, Jian
    IEEE TRANSACTIONS ON MEDICAL IMAGING, 2024, 43 (03) : 1089 - 1101