Fine-tuning pre-trained neural networks for medical image classification in small clinical datasets

Cited by: 10
Authors
Spolaor, Newton [1 ]
Lee, Huei Diana [1 ]
Mendes, Ana Isabel [2 ]
Nogueira, Conceicao Veloso [2 ,3 ]
Sabino Parmezan, Antonio Rafael [1 ,4 ]
Resende Takaki, Weber Shoity [1 ]
Rodrigues Coy, Claudio Saddy [5 ]
Wu, Feng Chung [1 ,5 ]
Fonseca-Pinto, Rui [2 ,6 ,7 ]
Affiliations
[1] Western Parana State Univ UNIOESTE, Lab Bioinformat, Presidente Tancredo Neves Ave 6731, BR-85867900 Foz Do Iguacu, Parana, Brazil
[2] Polytech Inst Leiria, Gen Norton Matos St 4133, P-2411901 Leiria, Portugal
[3] Univ Minho, Ctr Math, Braga, Portugal
[4] Univ Sao Paulo, Inst Math & Comp Sci, Lab Computat Intelligence, Sao Carlos, SP, Brazil
[5] Univ Estadual Campinas, Fac Med Sci, Serv Coloproctol, Campinas, SP, Brazil
[6] Polytech Inst Leiria, CiTechCare Ctr Innovat Care & Hlth Technol, Leiria, Portugal
[7] IT Inst Telecomunicacoes Leiria, Leiria, Portugal
Keywords
Feature learning; Few-shot learning; RMSprop; Shallow learning; Statistical test; VGG; MELANOMA; THICKNESS; FEATURES; TEXTURE
DOI
10.1007/s11042-023-16529-w
Chinese Library Classification (CLC)
TP [automation technology; computer technology]
Discipline Code
0812
Abstract
Convolutional neural networks have been effective in several applications, emerging as a promising support tool for a relevant problem in Dermatology: skin cancer diagnosis. However, generalizing well can be difficult when little training data is available. The fine-tuning transfer learning strategy has been employed to properly differentiate malignant from non-malignant lesions in dermoscopic images. Fine-tuning a pre-trained network allows one to classify data in the target domain, occasionally with few images, using knowledge acquired in another domain. This work proposes eight fine-tuning settings based on convolutional networks previously trained on ImageNet, intended mainly for limited data samples to reduce the risk of overfitting. The settings differ in architecture, learning rate, and the number of unfrozen layer blocks. We evaluated them on two public datasets with 104 and 200 dermoscopic images. By finding competitive configurations in small datasets, this paper illustrates that deep learning can be effective even when only a few dozen malignant and non-malignant lesion images are available to study and differentiate in Dermatology. The proposal is also flexible and potentially useful for other domains. In fact, it performed satisfactorily in an assessment conducted on a larger dataset with 746 computed tomography images associated with coronavirus disease.
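The record's keyword list (VGG, RMSprop) hints at a Keras-style workflow, so, as an illustration only, below is a minimal sketch of one plausible setting of the kind the abstract describes: an ImageNet-pre-trained VGG16 with a single unfrozen convolutional block, fine-tuned with RMSprop. The head layers, input size, and learning rate are assumptions; the record does not state the authors' exact configurations.

```python
from tensorflow.keras import layers, models, optimizers
from tensorflow.keras.applications import VGG16

# ImageNet-pre-trained VGG16 backbone, without its classifier head.
base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))

# Freeze every layer except the last convolutional block ("block5_*");
# the paper's settings reportedly vary how many blocks are unfrozen.
for layer in base.layers:
    layer.trainable = layer.name.startswith("block5")

# Small binary head for malignant vs. non-malignant lesions
# (width and dropout rate are illustrative assumptions).
model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(1, activation="sigmoid"),
])

# RMSprop with a small learning rate (assumed value) so fine-tuning
# does not overwrite the pre-trained weights too aggressively.
model.compile(
    optimizer=optimizers.RMSprop(learning_rate=1e-5),
    loss="binary_crossentropy",
    metrics=["accuracy"],
)
```

Training would then proceed with model.fit on the dermoscopic images; with datasets of only 104 or 200 images, data augmentation and early stopping would be natural companions to such a setting.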
Pages: 27305-27329
Page count: 25