Comparative Study of Fine-Tuning of Pre-Trained Convolutional Neural Networks for Diabetic Retinopathy Screening

Cited by: 0
Authors:
Mohammadian, Saboora [1 ]
Karsaz, Ali [1 ]
Roshan, Yaser M. [2 ]
Affiliations:
[1] Khorasan Inst Higher Educ, Elect Engn Dept, Mashhad, Iran
[2] Point Pk Univ, Elect Engn Dept, Pittsburgh, PA 15222 USA
Keywords: Diabetic retinopathy; convolutional neural network; deep learning; Inception model
DOI: not available
Chinese Library Classification: R318 [Biomedical Engineering]
Discipline code: 0831
Abstract:
Diabetic retinopathy is the leading cause of blindness and affects people of all ages. Early detection of the disease, although crucial for controlling and curing it, is often missed because it requires examination by an experienced specialist. To this end, automatic diabetic retinopathy diagnostic methods have been proposed to facilitate the examination process and assist the physician. In this paper, automatic diagnosis of diabetic retinopathy using pre-trained convolutional neural networks is studied. Pre-trained networks are chosen to avoid the time- and resource-consuming training required to design a convolutional neural network from scratch. Each neural network is fine-tuned on the pre-processed dataset, and the fine-tuning parameters, as well as the pre-trained networks themselves, are compared. As its main result, this paper introduces a fast approach to fine-tuning pre-trained networks by studying different tuning parameters and their effect on overall system performance for the specific application of diabetic retinopathy screening.
Pages: 224-229 (6 pages)