Development of retake support system for lateral knee radiographs by using deep convolutional neural network

Citations: 3
Authors
Ohta, Y. [1 ]
Matsuzawa, H. [2 ]
Yamamoto, K. [3 ]
Enchi, Y. [2 ]
Kobayashi, T. [3 ]
Ishida, T. [3 ]
Affiliations
[1] Osaka City Univ Hosp, Div Premier Prevent Med, MedCity21, Abeno Ku, Abeno Harukasu 21F, Abenosuji 1-1-43, Osaka 5458545, Japan
[2] Osaka Univ Hosp, Dept Radiol, Yamadaoka 2-15, Suita, Osaka 5650871, Japan
[3] Osaka Univ, Grad Sch Med, Dept Med Phys & Engn, Yamadaoka 1-7, Suita, Osaka 5650871, Japan
Keywords
Deep learning; Deep convolutional neural network; Transfer learning; Raysum image; Lateral knee radiograph; Retake; REJECT ANALYSIS; EXPOSURE;
DOI
10.1016/j.radi.2021.05.002
Chinese Library Classification (CLC)
R8 [Special Medicine]; R445 [Diagnostic Imaging]
Subject Classification Codes
1002; 100207; 1009
Abstract
Introduction: Lateral radiography of the knee joint is frequently performed; however, the retake rate is high owing to positioning errors. Therefore, to reduce the number of retakes and the time they require, we developed a system that classifies the tilting direction of lateral knee radiographs and evaluated its accuracy. Methods: The system classifies the tilting direction of a lateral knee radiograph into one of four direction categories. To train the deep convolutional neural network (DCNN), we employed Raysum images created from three-dimensional (3D) X-ray computed tomography (CT) data: 11 520 Raysum images were generated from 60 cases of 3D CT data by varying the projection angles, yielding pseudo images with the correct labels essential for training. The DCNN was trained on Raysum images from 50 cases and tested on three types of test datasets: Raysum images from ten additional cases, flexed knee joint phantom images from one case, and 14 rejected knee joint radiographs. Results: The overall accuracy on the three test datasets was 88.5 ± 7.0% (mean ± standard deviation), 81.4 ± 11.2%, and 73.3 ± 9.2%, respectively. The larger the degree of tilt of the knee joint, the higher the classification accuracy. Conclusion: The DCNN could classify the tilting direction of the knee joint from lateral knee radiographs, and using Raysum images facilitated creating the training dataset. These results indicate the feasibility of a retake support system for lateral knee radiographs. Implications for practice: The system may also reduce the burden on patients and increase the work efficiency of radiological technologists. (C) 2021 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.
Pages: 1110-1117
Page count: 8
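
As a rough illustration of the workflow summarized in the abstract above, the sketch below shows (1) a ray-sum style projection of a CT volume at a chosen tilt angle and (2) transfer learning of a pretrained CNN for four-way tilt-direction classification. This is a minimal sketch, not the authors' code: the backbone (ResNet-18), the libraries (NumPy, SciPy, PyTorch, torchvision), the folder-based labelling, and all function names are illustrative assumptions; the paper's actual implementation is not described in this record.

```python
# Minimal sketch (assumptions throughout, not the authors' implementation):
# (1) create a Raysum-like pseudo radiograph from 3D CT at a chosen tilt angle,
# (2) fine-tune a pretrained CNN to classify the tilt direction into four categories.
import numpy as np
from scipy.ndimage import rotate
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms
from torch.utils.data import DataLoader

def raysum_projection(ct_volume: np.ndarray, tilt_deg: float, axis_pair=(0, 2)) -> np.ndarray:
    """Sum ray paths through a tilted CT volume to mimic a lateral pseudo radiograph."""
    tilted = rotate(ct_volume, tilt_deg, axes=axis_pair, reshape=False, order=1)
    return tilted.sum(axis=axis_pair[1])  # integrate along the projection direction

NUM_DIRECTIONS = 4  # the four tilting-direction categories described in the abstract

# Preprocess grayscale Raysum images to match the pretrained backbone's input.
preprocess = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),  # pretrained backbone expects 3 channels
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

def train_direction_classifier(data_dir: str, epochs: int = 10, lr: float = 1e-4) -> nn.Module:
    """Transfer learning: replace the final layer and fine-tune on labelled Raysum images."""
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    dataset = datasets.ImageFolder(data_dir, transform=preprocess)  # hypothetical folder layout
    loader = DataLoader(dataset, batch_size=32, shuffle=True)
    model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
    model.fc = nn.Linear(model.fc.in_features, NUM_DIRECTIONS)
    model = model.to(device)
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for images, labels in loader:
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
    return model
```

The choice of a pretrained ImageNet backbone with a replaced final layer is one common way to realize the "transfer learning" named in the keywords when the labelled dataset is on the order of tens of cases; the actual network, angles, and label scheme used in the study are given only in the full paper.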