Comparative Analysis of Pre-trained Deep Neural Networks for Plant Disease Classification

Cited by: 0
Authors:
George, Romiyal [1 ]
Thuseethan, Selvarajah [2 ]
Ragel, Roshan G. [1 ]
Affiliations:
[1] Univ Peradeniya, Dept Comp Engn, Peradeniya, Sri Lanka
[2] Charles Darwin Univ, Fac Sci & Technol, Darwin, NT, Australia
Keywords:
Plant Disease Recognition; Deep Learning; Lightweight Networks; Pre-training; Fine-tuning
DOI:
10.1109/JCSSE61278.2024.10613633
Chinese Library Classification (CLC):
TP39 [Computer Applications]
Subject Classification Codes:
081203; 0835
Abstract:
Plant diseases are a common and significant problem for farmers worldwide, leading to reduced productivity and economic losses for both farmers and countries. Deep learning methods offer an efficient way to classify plant diseases at an early stage, enhancing the quality and quantity of agricultural products. Although traditional and computer-vision-based classification approaches exist, they frequently encounter challenges such as time-consuming processes, imbalanced data, and restricted field access. This research evaluates several widely used state-of-the-art deep networks on three datasets covering diseases of apple, tomato, and citrus leaves: PlantVillage, the Taiwan dataset, and the Citrus Fruits and Leaves Dataset. The evaluation results demonstrate that deep networks recognize disease images effectively. Notably, the comparison reveals that a different network performs best on each dataset: DenseNet201 for PlantVillage tomato, MobileNetV3 Large for Taiwan dataset tomato, MobileNetV2 for PlantVillage apple, and ResNet101 for the Citrus Fruits and Leaves Dataset.
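As an illustration of the fine-tuning workflow the abstract describes, the following is a minimal sketch of adapting an ImageNet pre-trained backbone (DenseNet201 via torchvision) to a plant disease image dataset. The paper does not publish its training pipeline, so the ImageFolder data layout, number of classes, learning rate, and epoch count below are illustrative assumptions, not the authors' settings.

# Illustrative fine-tuning sketch, not the authors' exact pipeline.
# Assumes: torchvision pre-trained weights and an ImageFolder-style layout
# (one sub-folder per disease class), e.g. a PlantVillage tomato subset.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

NUM_CLASSES = 10          # hypothetical number of disease classes; dataset-dependent
DATA_DIR = "data/train"   # hypothetical root directory of the training images

# Standard ImageNet preprocessing so inputs match the pre-trained backbone.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

dataset = datasets.ImageFolder(DATA_DIR, transform=preprocess)
loader = DataLoader(dataset, batch_size=32, shuffle=True, num_workers=2)

# Load an ImageNet pre-trained backbone and replace its classifier head
# with a new linear layer sized to the plant disease classes.
model = models.densenet201(weights=models.DenseNet201_Weights.IMAGENET1K_V1)
model.classifier = nn.Linear(model.classifier.in_features, NUM_CLASSES)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # illustrative learning rate

# Short fine-tuning loop; the epoch count and schedule are placeholders.
model.train()
for epoch in range(5):
    running_loss = 0.0
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
        running_loss += loss.item() * images.size(0)
    print(f"epoch {epoch + 1}: loss = {running_loss / len(dataset):.4f}")

The same pattern applies to the other backbones compared in the paper (MobileNetV2, MobileNetV3 Large, ResNet101): only the model constructor and the name of its final classification layer change.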
Pages: 179 - 186
Number of pages: 8