Segmentation of lung parenchyma based on new U-NET network

Cited: 0
Authors
Cheng L. [1]
Jiang L. [1]
Wang X. [1]
Liu Z. [1]
Zhao S. [2]
Affiliations
[1] School of Physical Science and Technology, Shenyang Normal University, Shenyang
[2] School of Information Engineering, Southwest University of Science and Technology, Fucheng District, Mianyang, Sichuan Province
Keywords
CT images of lung; deep learning; lung parenchymal segmentation; new U-NET;
DOI
10.1504/ijwmc.2022.126380
Abstract
As the risk of lung disease rises in daily life and COVID-19 spreads around the world, lung screening has become critical. Owing to the unique structure of lung tissue, it is difficult for traditional image segmentation methods to achieve accurate segmentation of the lungs. Given this structural complexity, experiments showed that segmentation accuracy was particularly low for parenchymal tissue in the upper and lower lung. To address this, a new network model, new U-NET, was proposed by improving and optimising the U-NET network model. Experimental data show that the proposed new U-NET model resolves the original U-NET model's low segmentation accuracy at both ends of the lung, improves the overall segmentation accuracy of the lung parenchyma, and demonstrates that the new U-NET model is better suited to parenchyma segmentation. Copyright © 2022 Inderscience Enterprises Ltd.
Pages: 173-182
Page count: 9
Related Papers
50 records
  • [41] Evolutionary U-Net for lung cancer segmentation on medical images
    Sahapudeen, Farjana Farvin
    Mohan, S. Krishna
    Journal of Intelligent and Fuzzy Systems, 2024, 46 (02): 3963-3974
  • [43] WEU-Net: A Weight Excitation U-Net for Lung Nodule Segmentation
    Furruka Banu, Syeda
    Sarker, Md Mostafa Kamal
    Abdel-Nasser, Mohamed
    Rashwan, Hatem A.
    Puig, Domenec
    ARTIFICIAL INTELLIGENCE RESEARCH AND DEVELOPMENT, 2021, 339 : 349 - 356
  • [44] Lung Nodule Segmentation and Classification using U-Net and Efficient-Net
    Suriyavarman, S.
    Annie, R. Arockia Xavier
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2023, 14 (07) : 737 - 745
  • [45] Artificially-generated consolidations and balanced augmentation increase performance of U-net for lung parenchyma segmentation on MR images
    Crisosto, Cristian
    Voskrebenzev, Andreas
    Gutberlet, Marcel
    Klimes, Filip
    Kaireit, Till F.
    Poehler, Gesa
    Moher, Tawfik
    Behrendt, Lea
    Mueller, Robin
    Zubke, Maximilian
    Wacker, Frank
    Vogel-Claussen, Jens
    PLOS ONE, 2023, 18 (05)
  • [46] An effective U-Net and BiSeNet complementary network for spine segmentation
    Deng, Yunjiao
    Gu, Feng
    Zeng, Daxing
    Lu, Junyan
    Liu, Haitao
    Hou, Yulei
    Zhang, Qinghua
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2024, 89
  • [47] Design of Superpiexl U-Net Network for Medical Image Segmentation
    Wang H.
    Liu H.
    Guo Q.
    Deng K.
    Zhang C.
    Jisuanji Fuzhu Sheji Yu Tuxingxue Xuebao/Journal of Computer-Aided Design and Computer Graphics, 2019, 31 (06): 1007-1017
  • [48] Estimation of Preterm Birth Markers with U-Net Segmentation Network
    Wlodarczyk, Tomasz
    Plotka, Szymon
    Trzcinski, Tomasz
    Rokita, Przemyslaw
    Sochacki-Wojcicka, Nicole
    Lipa, Michal
    Wojcicki, Jakub
    SMART ULTRASOUND IMAGING AND PERINATAL, PRETERM AND PAEDIATRIC IMAGE ANALYSIS, SUSI 2019, PIPPI 2019, 2019, 11798 : 95 - 103
  • [49] Enhanced U-Net: A Feature Enhancement Network for Polyp Segmentation
    Patel, Krushi
    Bur, Andres M.
    Wang, Guanghui
    2021 18TH CONFERENCE ON ROBOTS AND VISION (CRV 2021), 2021, : 181 - 188
  • [50] Fuzzy U-Net Neural Network Design for Image Segmentation
    Kirichev, Mark
    Slavov, Todor
    Momcheva, Galina
    CONTEMPORARY METHODS IN BIOINFORMATICS AND BIOMEDICINE AND THEIR APPLICATIONS, 2022, 374 : 177 - 184