A Population-Based Hybrid Approach for Hyperparameter Optimization of Neural Networks

Cited by: 10
Authors
Japa, Luis [1 ]
Serqueira, Marcello [2 ]
Mendonca, Israel [1 ]
Aritsugi, Masayoshi [3 ]
Bezerra, Eduardo [2 ]
Gonzalez, Pedro Henrique [4 ]
Affiliations
[1] Kumamoto Univ, Grad Sch Sci & Technol, Kumamoto 8608555, Japan
[2] Fed Ctr Technol Educ Rio De Janeiro CEFET RJ, BR-20271110 Rio De Janeiro, Brazil
[3] Kumamoto Univ, Fac Adv Sci & Technol, Kumamoto 8608555, Japan
[4] Univ Fed Rio de Janeiro, Syst Engn & Comp Sci Postgrad Program, BR-21941914 Rio De Janeiro, Brazil
Keywords
Genetic algorithms; hyperparameter optimization; machine learning; KEY GENETIC ALGORITHM;
DOI
10.1109/ACCESS.2023.3277310
Chinese Library Classification: TP [Automation Technology; Computer Technology]
Subject Classification Code: 0812
Abstract
Hyperparameter optimization is a fundamental part of Automated Machine Learning (AutoML) and has been widely researched in recent years; however, it remains one of the main challenges in the area. Motivated by the need for faster and more accurate hyperparameter optimization algorithms, we developed HyperBRKGA, a new population-based approach to hyperparameter optimization. HyperBRKGA combines the Biased Random-Key Genetic Algorithm with an Exploitation Method in order to search the hyperparameter space more efficiently than commonly used hyperparameter optimization algorithms such as Grid Search, Random Search, CMA-ES, or Bayesian Optimization. We develop and test two alternatives for this Exploitation Method: Random Walk and Bayesian Walk. We also discuss and implement other schemes, such as a Training Data Reduction Strategy and a Diversity Control Strategy, to further improve the efficacy of our method. We performed computational experiments on 8 datasets to assess the effectiveness of the proposed approach. The results show that HyperBRKGA found hyperparameter configurations that outperformed the baseline methods in terms of predictive quality on 6 of the 8 datasets, while showing reasonable execution times. Lastly, we conducted an ablation study and showed that every component contributed to achieving high-quality results.
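The abstract describes a biased random-key genetic algorithm (BRKGA) searching a hyperparameter space: individuals are vectors of random keys in [0, 1] that a decoder maps to concrete hyperparameters, and offspring inherit each key from an elite parent with some bias probability. The following is a minimal sketch of that general idea, not the authors' HyperBRKGA implementation; the search space, the stand-in objective, and all parameter values are illustrative assumptions.

```python
import random

# Hypothetical search space: each random key in [0, 1] decodes to one hyperparameter.
def decode(keys):
    lr = 10 ** (-4 + 3 * keys[0])       # learning rate in [1e-4, 1e-1], log scale
    units = int(16 + keys[1] * 240)     # hidden units in [16, 256]
    dropout = keys[2] * 0.5             # dropout rate in [0.0, 0.5)
    return lr, units, dropout

# Stand-in objective: a cheap analytic surrogate in place of real model training.
def fitness(keys):
    lr, units, dropout = decode(keys)
    return -((lr - 0.01) ** 2 * 1e4 + (units - 128) ** 2 * 1e-4 + (dropout - 0.2) ** 2)

def brkga(n_keys=3, pop=30, elite=6, mutants=6, rho=0.7, generations=40, seed=0):
    rng = random.Random(seed)
    population = [[rng.random() for _ in range(n_keys)] for _ in range(pop)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        elites = population[:elite]
        children = []
        for _ in range(pop - elite - mutants):
            e = rng.choice(elites)              # elite parent
            o = rng.choice(population[elite:])  # non-elite parent
            # Biased crossover: each key comes from the elite parent with probability rho.
            children.append([e[k] if rng.random() < rho else o[k] for k in range(n_keys)])
        # Fresh random individuals ("mutants") keep the population diverse.
        new_mutants = [[rng.random() for _ in range(n_keys)] for _ in range(mutants)]
        population = elites + children + new_mutants
    return max(population, key=fitness)

best = brkga()
print(decode(best))
```

In a real setting, `fitness` would train and validate a network for the decoded configuration, which is why the paper pairs the search with schemes such as training-data reduction to keep evaluations affordable.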
Pages: 50752-50768
Page count: 17