A sensitivity analysis for polyp segmentation with U-Net

Cited by: 7
Authors
Solak, Ahmet [1 ]
Ceylan, Rahime [1 ]
Affiliations
[1] Konya Tech Univ, Konya, Turkiye
Keywords
Polyp segmentation; U-Net; k-fold cross validation; loss function
DOI
10.1007/s11042-023-16368-9
CLC classification
TP [automation and computer technology]
Discipline code
0812
Abstract
Colorectal cancer (CRC) is one of the most common cancers in the world, and early diagnosis is critical to patient recovery. Colonoscopy is the gold-standard procedure for diagnosing CRC. In this context, this study focuses on accurate polyp detection to support early diagnosis of CRC. Polyp segmentation was performed on the public CVC-Clinic DB polyp dataset using the basic U-Net model and its derivatives (a modified U-Net, and a modified U-Net with transfer-learning encoders based on VGG-16 and VGG-19). For the sensitivity analysis, the models were trained on three datasets prepared with different preprocessing methods, in addition to the raw dataset, using k-fold cross validation (k = 2, 3, 4) and batch sizes from 1 to 5 in each cross validation. The best performance, Dice 0.868, Jaccard 0.799, sensitivity 0.873, and specificity 0.994, was obtained with a batch size of 1 and fourfold cross validation using the modified U-Net trained on the Discrete Wavelet Transform (DWT) dataset. This model and its parameters were then tested on the public Kvasir-Seg and Etis-Larib Polyp DB datasets, and further models were trained with the parameters of the most successful model. All results were interpreted and compared with the literature.
Pages: 34199-34227 (29 pages)
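
The evaluation protocol described in the abstract, a grid over k-fold cross validation (k = 2, 3, 4) and batch sizes 1 to 5 scored with Dice, Jaccard, sensitivity, and specificity, can be sketched as below. This is a minimal illustration rather than the authors' code: train_unet is a hypothetical stand-in for fitting any of the U-Net variants, the 0.5 threshold on predicted masks is an assumption, and NumPy and scikit-learn are assumed to be available.

import numpy as np
from sklearn.model_selection import KFold

def dice_coefficient(y_true, y_pred, eps=1e-7):
    # Dice = 2*|A & B| / (|A| + |B|) for binary masks.
    y_true, y_pred = y_true.astype(bool), y_pred.astype(bool)
    inter = np.logical_and(y_true, y_pred).sum()
    return (2.0 * inter + eps) / (y_true.sum() + y_pred.sum() + eps)

def jaccard_index(y_true, y_pred, eps=1e-7):
    # Jaccard (IoU) = |A & B| / |A or B| for binary masks.
    y_true, y_pred = y_true.astype(bool), y_pred.astype(bool)
    inter = np.logical_and(y_true, y_pred).sum()
    union = np.logical_or(y_true, y_pred).sum()
    return (inter + eps) / (union + eps)

def sensitivity_specificity(y_true, y_pred):
    # Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP).
    y_true, y_pred = y_true.astype(bool), y_pred.astype(bool)
    tp = np.logical_and(y_true, y_pred).sum()
    tn = np.logical_and(~y_true, ~y_pred).sum()
    fp = np.logical_and(~y_true, y_pred).sum()
    fn = np.logical_and(y_true, ~y_pred).sum()
    return tp / max(tp + fn, 1), tn / max(tn + fp, 1)

def sensitivity_analysis(images, masks, train_unet):
    # Sweep the settings reported in the abstract: k in {2, 3, 4} folds
    # and batch sizes 1-5; record the mean validation Dice per pair.
    # train_unet(x, y, batch_size) is a hypothetical training routine
    # returning a fitted model with a predict() method.
    results = {}
    for k in (2, 3, 4):
        for batch_size in (1, 2, 3, 4, 5):
            fold_scores = []
            kf = KFold(n_splits=k, shuffle=True, random_state=42)
            for train_idx, val_idx in kf.split(images):
                model = train_unet(images[train_idx], masks[train_idx],
                                   batch_size=batch_size)
                preds = model.predict(images[val_idx]) > 0.5
                fold_scores.append(np.mean([dice_coefficient(m, p)
                                            for m, p in zip(masks[val_idx], preds)]))
            results[(k, batch_size)] = float(np.mean(fold_scores))
    return results

Under this sweep, the best-scoring (k, batch size) pair would then be reused to evaluate the chosen model on held-out datasets such as Kvasir-Seg and Etis-Larib Polyp DB, mirroring the procedure the abstract describes.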