Comparative Study of Fine-Tuning of Pre-Trained Convolutional Neural Networks for Diabetic Retinopathy Screening

Cited by: 0
Authors
Mohammadian, Saboora [1 ]
Karsaz, Ali [1 ]
Roshan, Yaser M. [2 ]
Institutions
[1] Khorasan Inst Higher Educ, Elect Engn Dept, Mashhad, Iran
[2] Point Pk Univ, Elect Engn Dept, Pittsburgh, PA 15222 USA
Keywords
Diabetic retinopathy; convolutional neural network; deep learning; Inception model;
DOI
Not available
CLC number
R318 [Biomedical Engineering]
Subject classification code
0831
Abstract
Diabetic retinopathy is a leading cause of blindness, affecting people of all ages. Early detection of the disease, although crucial for controlling and treating it, is often missed because examination by an experienced specialist is required. To this end, automatic diabetic retinopathy diagnostic methods have been proposed to facilitate the examination process and assist physicians. In this paper, automatic diagnosis of diabetic retinopathy using pre-trained convolutional neural networks is studied. Pre-trained networks are chosen to avoid the time- and resource-intensive training required to design a convolutional neural network from scratch. Each neural network is fine-tuned on the pre-processed dataset, and both the fine-tuning parameters and the pre-trained neural networks are compared. As a result, this paper introduces a fast approach to fine-tuning pre-trained networks by studying different tuning parameters and their effect on overall system performance for the specific application of diabetic retinopathy screening.
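The fine-tuning workflow summarized in the abstract (reuse a pre-trained Inception-style CNN, freeze its convolutional base, and train a small classification head on retinal images) can be sketched as follows. This is a minimal illustration, not the authors' exact pipeline: Keras is assumed, the image size and optimizer settings are illustrative, and `weights=None` is used in place of `weights="imagenet"` only to keep the example self-contained without a weight download.

```python
# Hedged sketch of fine-tuning a pre-trained CNN for binary
# diabetic retinopathy screening (referable vs. non-referable).
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import InceptionV3

# In practice, weights="imagenet" loads the pre-trained filters;
# weights=None is used here to avoid downloading them.
base = InceptionV3(weights=None, include_top=False,
                   input_shape=(299, 299, 3))

# Freeze the pre-trained convolutional base so only the new
# classification head is updated during the first training phase.
base.trainable = False

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),  # collapse feature maps to a vector
    layers.Dense(1, activation="sigmoid"),  # screening probability
])

# A small learning rate is typical when fine-tuning, so the
# pre-trained features are not destroyed once layers are unfrozen.
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
              loss="binary_crossentropy", metrics=["accuracy"])
```

After the head converges, a common second phase (one of the "tuning parameters" the paper compares) is to unfreeze some or all of the base layers and continue training at an even smaller learning rate.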
Pages: 224-229 (6 pages)