Extracting Weld Bead Shapes from Radiographic Testing Images with U-Net

Times Cited: 10
Authors:
Jin, Gang-soo [1]
Oh, Sang-jin [1]
Lee, Yeon-seung [2]
Shin, Sung-chul [1]
Affiliations:
[1] Pusan Natl Univ, Dept Naval Architecture & Ocean Engn, Busan 46241, South Korea
[2] Hongik Univ, Dept Naval Architecture & Ocean Engn, Sejong 30016, South Korea
Source:
APPLIED SCIENCES-BASEL | 2021, Vol. 11, Issue 24
Keywords:
deep learning; image segmentation; weld bead
DOI:
10.3390/app112412051
CLC Number:
O6 [Chemistry];
Discipline Code:
0703;
Abstract:
The metal formed by melting the base metal and welding rod during a welding operation is referred to as a weld bead. The shape of the weld bead allows pores and defects such as cracks in the weld zone to be observed. Radiographic testing images are used to determine the quality of the weld zone. Extracting only the weld bead to determine its generative pattern can help locate defects in the weld zone efficiently. However, manually extracting the weld bead from weld images is neither time- nor cost-effective. Automating weld bead extraction through deep learning enables efficient and rapid welding quality inspection, and it secures objectivity in the quality inspection and assessment of the weld zone in the shipbuilding and offshore plant industry. This study presents a method for detecting the weld bead shape and location in the weld zone image using image preprocessing and a deep learning model, and for extracting the weld bead through image post-processing. In addition, to diversify the data and improve deep learning performance, data augmentation was performed to artificially expand the image data. Contrast limited adaptive histogram equalization (CLAHE) is used as the image preprocessing method, and the bead is extracted using U-Net, a pixel-based deep learning model. Consequently, the mean intersection over union (mIoU) values are found to be 90.58% and 85.44% in the training and test experiments, respectively. Successful extraction of the bead from the radiographic testing image through post-processing is achieved.
Pages: 13
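
The abstract names three concrete, reusable pieces: CLAHE contrast enhancement before segmentation, a U-Net model producing a pixel-wise bead mask, and mean IoU as the evaluation metric. The sketch below, assuming OpenCV and NumPy, shows roughly how the CLAHE step and the mIoU computation could look; the file name and the clipLimit/tileGridSize values are illustrative assumptions rather than parameters reported by the authors, and the trained U-Net (not shown) would sit between the two steps to produce the predicted mask.

    import cv2
    import numpy as np

    # Hypothetical input file; the paper's radiographic data set is not reproduced here.
    image = cv2.imread("weld_rt_image.png", cv2.IMREAD_GRAYSCALE)

    # CLAHE preprocessing as named in the abstract; clipLimit and tileGridSize
    # are illustrative defaults, not values reported in the paper.
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enhanced = clahe.apply(image)

    def mean_iou(pred, truth):
        # Mean IoU over the two classes of a binary mask (0 = background, 1 = bead).
        ious = []
        for cls in (0, 1):
            p, t = pred == cls, truth == cls
            union = np.logical_or(p, t).sum()
            if union == 0:
                continue
            ious.append(np.logical_and(p, t).sum() / union)
        return float(np.mean(ious))

Whether the abstract's mIoU figures average over both classes or over images is not stated; the two-class average above is only one common convention.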