Skin cancer diagnosis based on deep transfer learning and sparrow search algorithm

Cited by: 0
Authors
Hossam Magdy Balaha
Asmaa El-Sayed Hassan
Affiliations
[1] Mansoura University, Computer Engineering and Systems Department, Faculty of Engineering
[2] Mansoura University, Mathematics and Engineering Physics Department, Faculty of Engineering
Source
Keywords
Skin cancer; Melanoma cancer; Non-melanoma cancer; Convolutional neural network (CNN); Deep learning (DL); Meta-heuristic optimization; Segmentation; Sparrow search algorithm (SpaSA)
DOI
Not available
Abstract
Skin cancer affects the lives of millions of people every year, as it is the most common form of cancer. In the USA alone, approximately three and a half million people are diagnosed with skin cancer annually. The survival rate diminishes steeply as the cancer progresses. Even so, detecting this type of cancer in its early stages remains an expensive and difficult procedure. In this study, a threshold-based automatic approach for skin cancer detection, classification, and segmentation utilizing a meta-heuristic optimizer named the sparrow search algorithm (SpaSA) is proposed. Five U-Net models (i.e., U-Net, U-Net++, Attention U-Net, V-Net, and Swin U-Net) with different configurations are utilized to perform the segmentation process. In addition, the meta-heuristic SpaSA optimizer is used to optimize the hyperparameters of eight pre-trained CNN models (i.e., VGG16, VGG19, MobileNet, MobileNetV2, MobileNetV3Large, MobileNetV3Small, NASNetMobile, and NASNetLarge). The dataset is gathered from five public sources, from which two types of datasets are generated (i.e., 2-classes and 10-classes).
For the segmentation, concerning the “skin cancer segmentation and classification” dataset, the best reported scores by U-Net++ with DenseNet201 as the backbone architecture are 0.104, 94.16%, 91.39%, 99.03%, 96.08%, 96.41%, 77.19%, and 75.47% in terms of loss, accuracy, F1-score, AUC, IoU, dice, hinge, and squared hinge, respectively. For the “PH2” dataset, the best reported scores by Attention U-Net with DenseNet201 as the backbone architecture are 0.137, 94.75%, 92.65%, 92.56%, 92.74%, 96.20%, 86.30%, 92.65%, 69.28%, and 68.04% in terms of loss, accuracy, F1-score, precision, sensitivity, specificity, IoU, dice, hinge, and squared hinge, respectively. For the “ISIC 2019 and 2020 Melanoma” dataset, the best reported overall accuracy from the applied CNN experiments is 98.27%, achieved by the MobileNet pre-trained model. Similarly, for the “Melanoma Classification (HAM10K)” dataset, the best reported overall accuracy is 98.83%, also by the MobileNet pre-trained model. For the “skin diseases image” dataset, the best reported overall accuracy is 85.87%, by the MobileNetV2 pre-trained model. After computing the results, the suggested approach is compared with 13 related studies.
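As context for the optimizer named in the abstract, the following is a minimal, illustrative sketch of the sparrow search algorithm minimising a toy objective. It is not the authors' implementation; the function name `spasa`, the parameter defaults, and the simplified producer/scrounger update rules are assumptions made for illustration only. In the paper, the objective would instead score a CNN hyperparameter configuration (e.g., validation loss).

```python
import numpy as np

def spasa(objective, dim, pop_size=20, iters=100, lb=-5.0, ub=5.0,
          producer_ratio=0.2, ST=0.8, seed=0):
    """Simplified sparrow search algorithm (minimisation sketch)."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (pop_size, dim))          # sparrow positions
    fit = np.apply_along_axis(objective, 1, X)        # fitness of each sparrow
    n_prod = max(1, int(pop_size * producer_ratio))   # number of producers
    for _ in range(iters):
        order = np.argsort(fit)                       # rank best to worst
        X, fit = X[order], fit[order]
        best, worst = X[0].copy(), X[-1].copy()
        R2 = rng.random()                             # alarm value
        # Producers: search widely while safe, jump randomly when alarmed
        for i in range(n_prod):
            if R2 < ST:
                X[i] *= np.exp(-i / (rng.random() * iters + 1e-12))
            else:
                X[i] += rng.normal(size=dim)
        # Scroungers: follow the best producer, or flee the worst position
        for i in range(n_prod, pop_size):
            if i > pop_size / 2:
                X[i] = rng.normal(size=dim) * np.exp((worst - X[i]) / (i ** 2))
            else:
                X[i] = best + np.abs(X[i] - best) * rng.choice([-1, 1], size=dim)
        # A few danger-aware sparrows move toward the current best
        for i in rng.choice(pop_size, max(1, pop_size // 10), replace=False):
            X[i] = best + rng.normal() * np.abs(X[i] - best)
        X = np.clip(X, lb, ub)
        fit = np.apply_along_axis(objective, 1, X)
    i_best = int(np.argmin(fit))
    return X[i_best], float(fit[i_best])

# Toy usage: minimise the 3-D sphere function
best_x, best_f = spasa(lambda x: float(np.sum(x ** 2)), dim=3)
```

For hyperparameter optimization as in the paper, each position vector would encode a candidate configuration (learning rate, dropout, batch size, etc.) and `objective` would train or fine-tune the pre-trained CNN and return its validation error.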
Pages: 815-853
Number of pages: 38