A Population-Based Hybrid Approach for Hyperparameter Optimization of Neural Networks

Cited: 10
Authors
Japa, Luis [1 ]
Serqueira, Marcello [2 ]
Mendonca, Israel [1 ]
Aritsugi, Masayoshi [3 ]
Bezerra, Eduardo [2 ]
Gonzalez, Pedro Henrique [4 ]
Affiliations
[1] Kumamoto Univ, Grad Sch Sci & Technol, Kumamoto 8608555, Japan
[2] Fed Ctr Technol Educ Rio De Janeiro CEFET RJ, BR-20271110 Rio De Janeiro, Brazil
[3] Kumamoto Univ, Fac Adv Sci & Technol, Kumamoto 8608555, Japan
[4] Univ Fed Rio de Janeiro, Syst Engn & Comp Sci Postgrad Program, BR-21941914 Rio De Janeiro, Brazil
Keywords
Genetic algorithms; hyperparameter optimization; machine learning; KEY GENETIC ALGORITHM
DOI
10.1109/ACCESS.2023.3277310
CLC Number
TP [automation technology, computer technology]
Subject Classification Code
0812
Abstract
Hyperparameter optimization is a fundamental part of Automated Machine Learning (AutoML) and has been widely researched in recent years; nevertheless, it remains one of the main challenges in the area. Motivated by the need for faster and more accurate hyperparameter optimization algorithms, we developed HyperBRKGA, a new population-based approach for hyperparameter optimization. HyperBRKGA combines the Biased Random-Key Genetic Algorithm (BRKGA) with an Exploitation Method in order to search the hyperparameter space more efficiently than commonly used hyperparameter optimization algorithms such as Grid Search, Random Search, CMA-ES, or Bayesian Optimization. We develop and test two alternatives for this Exploitation Method: Random Walk and Bayesian Walk. We also discuss and implement other schemes, such as a Training Data Reduction Strategy and a Diversity Control Strategy, to further improve the efficacy of our method. We performed several computational experiments on 8 different datasets to assess the effectiveness of the proposed approach. The results showed that HyperBRKGA found hyperparameter configurations that outperformed the baseline methods in terms of predictive quality on 6 out of 8 datasets, while showing reasonable execution times. Lastly, we conducted an ablation study and showed that each added component was relevant to achieving high-quality results.
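For readers unfamiliar with random-key genetic algorithms, the sketch below illustrates the core idea the abstract describes: real-valued key vectors in [0, 1] are decoded into concrete hyperparameter configurations, an elite-biased crossover drives the population forward, and an exploitation step (here, the Random Walk variant) perturbs the incumbent best solution. This is a minimal sketch, not the paper's implementation: the search space, the rates, and the toy fitness function are illustrative assumptions; in the paper's setting, the fitness evaluation would train and validate a neural network for each decoded configuration.

```python
import random

# Illustrative search space: each random key in [0, 1] decodes to one
# hyperparameter. These bounds are assumptions, not the paper's space.
SPACE = {
    "learning_rate": lambda k: 10 ** (-4 + 3 * k),        # 1e-4 .. 1e-1, log scale
    "hidden_units":  lambda k: int(16 + k * (512 - 16)),  # 16 .. 512
    "dropout":       lambda k: 0.5 * k,                   # 0.0 .. 0.5
}
KEYS = list(SPACE)

def decode(chromosome):
    """Map a vector of random keys onto concrete hyperparameter values."""
    return {name: SPACE[name](k) for name, k in zip(KEYS, chromosome)}

def fitness(config):
    """Stand-in for 'train a network, return validation quality'.
    A toy objective with a known optimum, for demonstration only."""
    return -((config["learning_rate"] - 0.01) ** 2 * 1e4
             + (config["hidden_units"] - 128) ** 2 * 1e-4
             + (config["dropout"] - 0.2) ** 2)

def biased_crossover(elite, non_elite, rho_e=0.7):
    """Each gene is inherited from the elite parent with probability rho_e."""
    return [e if random.random() < rho_e else n
            for e, n in zip(elite, non_elite)]

def random_walk(chromosome, step=0.05):
    """Exploitation step: small perturbation of the keys, clipped to [0, 1]."""
    return [min(1.0, max(0.0, k + random.uniform(-step, step)))
            for k in chromosome]

def hyper_brkga(pop_size=20, elite_frac=0.2, mutant_frac=0.15, generations=30):
    pop = [[random.random() for _ in KEYS] for _ in range(pop_size)]
    n_elite = max(1, int(elite_frac * pop_size))
    n_mutants = max(1, int(mutant_frac * pop_size))
    for _ in range(generations):
        pop.sort(key=lambda c: fitness(decode(c)), reverse=True)
        elite, rest = pop[:n_elite], pop[n_elite:]
        # Exploit around the incumbent best (Random Walk variant).
        candidate = random_walk(pop[0])
        if fitness(decode(candidate)) > fitness(decode(pop[0])):
            elite[0] = candidate
        # Mutants reintroduce diversity; offspring come from biased crossover.
        mutants = [[random.random() for _ in KEYS] for _ in range(n_mutants)]
        offspring = [biased_crossover(random.choice(elite), random.choice(rest))
                     for _ in range(pop_size - n_elite - n_mutants)]
        pop = elite + offspring + mutants
    return decode(max(pop, key=lambda c: fitness(decode(c))))

print(hyper_brkga())
```

Because each real fitness evaluation is a full training run, the Training Data Reduction Strategy mentioned in the abstract would cut this cost by evaluating candidates on a subset of the training data, and the Diversity Control strategy would guard the population against premature convergence.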
Pages: 50752 - 50768
Number of pages: 17
Related Papers
50 records in total
  • [41] Static facial expression recognition using convolutional neural networks based on transfer learning and hyperparameter optimization
    Ozcan, Tayyip
    Basturk, Alper
    MULTIMEDIA TOOLS AND APPLICATIONS, 2020, 79 : 26587 - 26604
  • [42] A comparative study of population-based optimization algorithms for downstream river flow forecasting by a hybrid neural network model
    Chen, X. Y.
    Chau, K. W.
    Busari, A. O.
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2015, 46 : 258 - 268
  • [43] Hyperparameter Optimization for Capsule Network Based Modified Hybrid Rice Optimization Algorithm
    Ye, Zhiwei
    Fang, Ziqian
    Song, Zhina
    Sui, Haigang
    Yan, Chunyan
    Zhou, Wen
    Wang, Mingwei
INTELLIGENT AUTOMATION AND SOFT COMPUTING, 2023, 37 (02): 2019 - 2035
  • [44] A Combinatorial Approach to Hyperparameter Optimization
    Khadka, Krishna
    Chandrasekaran, Jaganmohan
    Lei, Yu
    Kacker, Raghu N.
    Kuhn, D. Richard
PROCEEDINGS 2024 IEEE/ACM 3RD INTERNATIONAL CONFERENCE ON AI ENGINEERING-SOFTWARE ENGINEERING FOR AI, CAIN 2024, 2024: 140 - 149
  • [45] Autonomous Process Model Identification using Recurrent Neural Networks and Hyperparameter Optimization
    Mercangoez, Mehmet
    Cortinovis, Andrea
    Schoenborn, Sandro
    IFAC PAPERSONLINE, 2020, 53 (02): : 11614 - 11619
  • [46] Prediction of CBR by Deep Artificial Neural Networks with Hyperparameter Optimization by Simulated Annealing
    Yabi, Crespin Prudence
    Agongbe, Setondji Wadoscky
    Tamou, Bio Cheissou Koto
    Farsangi, Ehsan Noroozinejad
    Alamou, Eric
    Gibigaye, Mohamed
    INDIAN GEOTECHNICAL JOURNAL, 2024, 54 (06) : 2318 - 2334
  • [47] Training-free hyperparameter optimization of neural networks for electronic structures in matter
    Fiedler, Lenz
    Hoffmann, Nils
    Mohammed, Parvez
    Popoola, Gabriel A.
    Yovell, Tamar
    Oles, Vladyslav
    Ellis, J. Austin
    Rajamanickam, Sivasankaran
    Cangi, Attila
MACHINE LEARNING-SCIENCE AND TECHNOLOGY, 2022, 3 (04)
  • [48] Toward classifying small lung nodules with hyperparameter optimization of convolutional neural networks
    Lima, Lucas L.
    Ferreira Junior, Jose R.
    Oliveira, Marcelo C.
    COMPUTATIONAL INTELLIGENCE, 2021, 37 (04) : 1599 - 1618
  • [49] On hybrid quanvolutional neural networks optimization
    Ceschini, Andrea
    Carbone, Andrea
    Sebastianelli, Alessandro
    Panella, Massimo
    Le Saux, Bertrand
    QUANTUM MACHINE INTELLIGENCE, 2025, 7 (01)
  • [50] Co-evolving Recurrent Neural Networks and their Hyperparameters with Simplex Hyperparameter Optimization
    Kini, Amit Dilip
    Yadav, Swaraj Sambhaji
    Thakur, Aditya Shankar
    Awari, Akshar Bajrang
    Lyu, Zimeng
    Desell, Travis
    PROCEEDINGS OF THE 2023 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE COMPANION, GECCO 2023 COMPANION, 2023, : 1639 - 1647