Enhanced harmony search for hyperparameter tuning of deep neural networks

Cited by: 1
Authors
Purnomo H.D. [1]
Gonsalves T. [2]
Wahyono T. [1]
Saian P.O.N. [1]
Affiliations
[1] Department of Information Technology, Universitas Kristen Satya Wacana, Salatiga
[2] Department of Information and Communication Sciences, Sophia University, Tokyo
Keywords
Configuration; Deep neural network; Harmony memory consideration rate; Harmony search; Rank-based selection
DOI
10.1007/s00500-024-09840-7
Abstract
The performance of a deep neural network (DNN) is affected by its configuration as well as by its training process. Determining the configuration of a DNN and training its parameters are challenging tasks because both are high-dimensional problems, so methods are needed that can optimize a DNN's configuration and parameters. Most existing DNN optimization research concerns the optimization of DNN parameters; only a few studies address the optimization of DNN configuration. In this paper, an enhanced harmony search is proposed to optimize the configuration of a fully connected neural network (FCNN). The enhancement introduces several types of harmony memory consideration rate and of harmony memory selection. Four types of harmony memory consideration rate are proposed: a constant rate, a linearly increasing rate, a linearly decreasing rate, and a sigmoid rate. Two types of harmony memory selection are proposed: rank-based selection and random selection. Combining the rate types with the selection types yields eight harmony search scenarios. The performance of the proposed method is compared with random search and a genetic algorithm on 12 classification datasets. The experimental results show that the proposed harmony search outperforms random search on 8 of the 12 problems and performs approximately the same on the remaining 4. It also outperforms the genetic algorithm on 5 problems, performs approximately the same on 6, and performs worse on 1. In addition, combining the various harmony memory consideration rates with rank-based selection improves the performance of ordinary harmony search. The combination of the linearly increasing consideration rate with rank-based selection performs best among all combinations: it beats ordinary harmony search on 7 problems, is approximately equal on 3, and is worse on 2. The results show that the proposed method has two advantages for solving classification problems with a DNN. First, the DNN configuration is represented as an optimization problem, so the method can find a specific FCNN configuration suited to a specific problem. Second, the approach performs global optimization, tuning the DNN hyperparameters (configuration) as well as the DNN parameters (connection weights), and is therefore able to find the best combination of DNN configuration and connection weights. However, a strategy is still needed to balance hyperparameter tuning against parameter tuning; an inappropriate balance can lead to high computational cost. Future research can be directed toward balancing hyperparameter and parameter tuning during the solution search. © The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature 2024.
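To make the enhancement concrete, below is a minimal Python sketch of the two mechanisms the abstract describes: a scheduled harmony memory consideration rate (HMCR) and rank-based harmony memory selection. This is not the authors' implementation; the schedule bounds (hmcr_min, hmcr_max), the sigmoid steepness, and the toy harmony memory of FCNN hidden-layer widths are illustrative assumptions.

    import math
    import random

    def hmcr_schedule(kind, t, t_max, hmcr_min=0.7, hmcr_max=0.99, steepness=10.0):
        """Harmony memory consideration rate at iteration t (0 <= t <= t_max)
        for the four proposed rate types."""
        frac = t / t_max
        if kind == "constant":
            return hmcr_max                               # as in ordinary harmony search
        if kind == "linear_increase":
            return hmcr_min + (hmcr_max - hmcr_min) * frac
        if kind == "linear_decrease":
            return hmcr_max - (hmcr_max - hmcr_min) * frac
        if kind == "sigmoid":
            # S-shaped rise from hmcr_min toward hmcr_max, centered mid-search
            s = 1.0 / (1.0 + math.exp(-steepness * (frac - 0.5)))
            return hmcr_min + (hmcr_max - hmcr_min) * s
        raise ValueError(f"unknown HMCR type: {kind}")

    def select_from_memory(memory, fitnesses, rank_based=True):
        """Pick one harmony from memory: uniformly at random, or with
        probability proportional to fitness rank (best harmony = largest weight)."""
        if not rank_based:
            return random.choice(memory)
        order = sorted(range(len(memory)), key=lambda i: fitnesses[i])  # worst..best
        weights = [0] * len(memory)
        for rank, idx in enumerate(order, start=1):
            weights[idx] = rank                           # rank 1 = worst, n = best
        return random.choices(memory, weights=weights, k=1)[0]

    if __name__ == "__main__":
        # Toy harmony memory: each harmony is an FCNN configuration
        # (hidden-layer widths), with validation accuracy as its fitness.
        memory = [[8, 16], [32, 64], [16, 32]]
        fitnesses = [0.80, 0.92, 0.88]
        hmcr = hmcr_schedule("linear_increase", t=250, t_max=1000)
        if random.random() < hmcr:                        # consult harmony memory
            print("from memory:", select_from_memory(memory, fitnesses))
        else:                                             # otherwise improvise a fresh harmony
            print("sample a new random configuration")

Pairing each of the four rate schedules with either selection rule gives the eight harmony search scenarios counted in the abstract.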
Pages: 9905-9919
Number of pages: 15
Related papers
50 records in total
  • [41] An Algorithm for Numerical Solution of Differential Equations Using Harmony Search and Neural Networks
    Yadav, Neha
    Ngo, Thi Thuy
    Kim, Joong Hoon
    JOURNAL OF APPLIED ANALYSIS AND COMPUTATION, 2022, 12(04): 1277-1293
  • [42] Transfer Learning based Search Space Design for Hyperparameter Tuning
    Li, Yang
    Shen, Yu
    Jiang, Huaijun
    Bai, Tianyi
    Zhang, Wentao
    Zhang, Ce
    Cui, Bin
    PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022: 967-977
  • [43] Modified Harmony Search Algorithm and Neural Networks for Concrete Mix Proportion Design
    Lee, Joo-Ha
    Yoon, Young-Soo
    JOURNAL OF COMPUTING IN CIVIL ENGINEERING, 2009, 23(01): 57-61
  • [44] Deep neural networks for gas concentration estimation and the effect of hyperparameter optimization on the estimation performance
    Jang, Hee-Deok
    Park, Jae-Hyeon
    Nam, Hyunwoo
    Chang, Dong Eui
    2022 22ND INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION AND SYSTEMS (ICCAS 2022), 2022: 15-19
  • [45] Automatic Optimization of Hyperparameters for Deep Convolutional Neural Networks: Grid Search Enhanced with Coordinate Ascent
    Song, Qingqing
    Xia, Shaoliang
    Wu, Zhen
    PROCEEDINGS OF 2024 INTERNATIONAL CONFERENCE ON MACHINE INTELLIGENCE AND DIGITAL APPLICATIONS, MIDA2024, 2024: 300-306
  • [46] Neural Architecture Search Using Deep Neural Networks and Monte Carlo Tree Search
    Wang, Linnan
    Zhao, Yiyang
    Jinnai, Yuu
    Tian, Yuandong
    Fonseca, Rodrigo
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34: 9983-9991
  • [47] Deep Neural Network with Hyperparameter Tuning on Early Detection for Symptom Recognition in Suspected Covid-19
    Djuniadi, Djuniadi
    Iksan, Nur
    Suni, Alfa Faridh
    Hastawan, Ahmad Fashiha
    TEM JOURNAL-TECHNOLOGY EDUCATION MANAGEMENT INFORMATICS, 2023, 12(02): 1000-1007
  • [48] DeepQGHO: Quantized Greedy Hyperparameter Optimization in Deep Neural Networks for on-the-Fly Learning
    Chowdhury, Anjir Ahmed
    Hossen, Md Abir
    Azam, Md Ali
    Rahman, Md Hafizur
    IEEE ACCESS, 2022, 10: 6407-6416
  • [49] Efficient hyperparameter optimization with Probability-based Resource Allocating on deep neural networks
    Li, Wenguo
    Yin, Xudong
    Ye, Mudan
    Zhu, Pengxu
    Li, Jinghua
    Yang, Yao
    NEUROCOMPUTING, 2024, 599
  • [50] Speeding Up Automatic Hyperparameter Optimization of Deep Neural Networks by Extrapolation of Learning Curves
    Domhan, Tobias
    Springenberg, Jost Tobias
    Hutter, Frank
    PROCEEDINGS OF THE TWENTY-FOURTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE (IJCAI), 2015: 3460-3468