Comparative Study of Fine-Tuning of Pre-Trained Convolutional Neural Networks for Diabetic Retinopathy Screening

Cited by: 0
Authors
Mohammadian, Saboora [1 ]
Karsaz, Ali [1 ]
Roshan, Yaser M. [2 ]
Affiliations
[1] Khorasan Inst Higher Educ, Elect Engn Dept, Mashhad, Iran
[2] Point Pk Univ, Elect Engn Dept, Pittsburgh, PA 15222 USA
Keywords
Diabetic retinopathy; convolutional neural network; deep learning; Inception model;
DOI
Not available
CLC Classification Number
R318 [Biomedical Engineering];
Discipline Classification Code
0831;
Abstract
Diabetic retinopathy is the leading cause of blindness, affecting people of different ages. Although early detection of the disease is essential for controlling and treating it, it is often missed because examination requires an experienced specialist. To this end, automatic diabetic retinopathy diagnostic methods have been proposed to facilitate the examination process and serve as an aid to the physician. In this paper, automatic diagnosis of diabetic retinopathy using pre-trained convolutional neural networks is studied. Pre-trained networks are chosen to avoid the time- and resource-intensive process of training a convolutional neural network from scratch. Each network is fine-tuned on the pre-processed dataset, and both the fine-tuning parameters and the pre-trained networks themselves are compared. As its main result, this paper introduces a fast approach to fine-tuning pre-trained networks by studying different tuning parameters and their effect on overall system performance for the specific application of diabetic retinopathy screening.
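This record is abstract-only, so no code accompanies it. Purely as an illustration of the workflow the abstract describes, the sketch below fine-tunes an ImageNet-pre-trained InceptionV3 (the Inception model named in the keywords) for binary screening using TensorFlow/Keras. The framework choice, the two-stage freeze/unfreeze schedule, the learning rates, and the retina_data/ directory layout are all assumptions made for illustration, not details taken from the paper.

```python
# Minimal sketch: fine-tuning a pre-trained Inception model for binary
# diabetic retinopathy screening. Assumes TensorFlow/Keras is installed and
# that "retina_data/" is a hypothetical directory of pre-processed fundus
# images organized into class subfolders (e.g. "healthy/" and "dr/").
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE = (299, 299)  # InceptionV3's expected input resolution

# Load the ImageNet-pre-trained convolutional base without its classifier head.
base = tf.keras.applications.InceptionV3(
    weights="imagenet", include_top=False, input_shape=IMG_SIZE + (3,))
base.trainable = False  # stage 1: train only the new classification head

model = models.Sequential([
    # InceptionV3 expects inputs scaled to [-1, 1].
    layers.Rescaling(1.0 / 127.5, offset=-1, input_shape=IMG_SIZE + (3,)),
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.5),
    layers.Dense(1, activation="sigmoid"),  # binary screening output
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="binary_crossentropy", metrics=["accuracy"])

# Hypothetical pre-processed dataset loaded from disk.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "retina_data/train", image_size=IMG_SIZE, batch_size=32, label_mode="binary")
val_ds = tf.keras.utils.image_dataset_from_directory(
    "retina_data/val", image_size=IMG_SIZE, batch_size=32, label_mode="binary")

model.fit(train_ds, validation_data=val_ds, epochs=5)

# Stage 2: unfreeze only the top of the base network and fine-tune it with a
# much smaller learning rate; how many layers to unfreeze and which learning
# rate to use are exactly the kind of tuning parameters the paper compares.
base.trainable = True
for layer in base.layers[:-30]:  # keep earlier, more generic layers frozen
    layer.trainable = False
model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
              loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=5)
```

Freezing the convolutional base first and then unfreezing only its top layers at a lower learning rate is a common way to trade training time against accuracy when fine-tuning pre-trained networks for a new screening task.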
Pages: 224-229
Number of pages: 6