Efficient Hyperparameter Optimization for Convolution Neural Networks in Deep Learning: A Distributed Particle Swarm Optimization Approach

Cited by: 55
Authors
Guo, Yu [1 ]
Li, Jian-Yu [1 ]
Zhan, Zhi-Hui [1 ]
Affiliations
[1] South China Univ Technol, Sch Comp Sci & Engn, Guangzhou 510006, Peoples R China
Keywords
Convolution neural network (CNN); deep learning; distributed particle swarm optimization algorithm (DPSO); hyperparameter; particle swarm optimization (PSO); ALGORITHM;
DOI
10.1080/01969722.2020.1827797
CLC Classification Number
TP3 [Computing Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
The convolution neural network (CNN) is a powerful and efficient deep learning approach that has achieved great success in many real-world applications. However, due to its complex network structure, the interdependence of its hyperparameters, and the time-consuming procedure of network training, finding an efficient network configuration for a CNN is a challenging and laborious task. To solve the hyperparameter setting problem efficiently, this paper proposes a distributed particle swarm optimization (DPSO) approach that optimizes the hyperparameters to find high-performing CNNs. Compared with tedious manual designs based on historical experience and personal preference, the proposed DPSO approach evolves the hyperparameters automatically and globally to obtain promising CNNs, offering a new way to search for the globally optimal hyperparameter combination. Moreover, by incorporating distributed computing techniques, the DPSO approach achieves a considerable speedup over the traditional particle swarm optimization (PSO) algorithm. Extensive experiments on widely used image classification benchmarks verify that the proposed DPSO approach can effectively find CNN models with promising performance while greatly reducing computational time compared with traditional PSO.
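As a rough illustration of the idea described in the abstract (not the authors' exact DPSO algorithm), the sketch below runs a standard PSO loop over a small, assumed hyperparameter space (learning rate, batch size, number of filters) and dispatches the per-particle fitness evaluations to a process pool, which is where the distributed speedup would come from. The fitness function here is a synthetic surrogate so the sketch runs standalone; in the actual approach it would be the validation error obtained by training the CNN with the candidate hyperparameters.

```python
# Minimal PSO-over-hyperparameters sketch with parallel fitness evaluation.
# Illustrative only: the search space and surrogate fitness are assumptions,
# not the DPSO algorithm or experimental setup from the paper.
import random
from concurrent.futures import ProcessPoolExecutor

# Hypothetical search space: (learning_rate, batch_size, num_filters)
BOUNDS = [(1e-4, 1e-1), (16, 256), (8, 128)]

def evaluate(position):
    """Stand-in for CNN training: in practice this would build and train a
    CNN with these hyperparameters and return 1 - validation accuracy."""
    lr, bs, nf = position
    # Synthetic, smooth surrogate so the sketch runs without a GPU.
    return (lr - 0.01) ** 2 + ((bs - 64) / 256) ** 2 + ((nf - 32) / 128) ** 2

def clip(pos):
    # Keep each dimension inside its bounds.
    return [min(max(x, lo), hi) for x, (lo, hi) in zip(pos, BOUNDS)]

def pso(num_particles=8, iters=20, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(num_particles)]
    V = [[0.0] * len(BOUNDS) for _ in range(num_particles)]
    with ProcessPoolExecutor() as pool:
        fit = list(pool.map(evaluate, X))           # evaluations run in parallel
        pbest, pbest_fit = [x[:] for x in X], fit[:]
        g = min(range(num_particles), key=lambda i: fit[i])
        gbest, gbest_fit = X[g][:], fit[g]
        for _ in range(iters):
            for i in range(num_particles):
                # Standard velocity and position update.
                V[i] = [w * v
                        + c1 * rng.random() * (pb - x)
                        + c2 * rng.random() * (gb - x)
                        for v, x, pb, gb in zip(V[i], X[i], pbest[i], gbest)]
                X[i] = clip([x + v for x, v in zip(X[i], V[i])])
            fit = list(pool.map(evaluate, X))       # parallel evaluation again
            for i in range(num_particles):
                if fit[i] < pbest_fit[i]:
                    pbest[i], pbest_fit[i] = X[i][:], fit[i]
                    if fit[i] < gbest_fit:
                        gbest, gbest_fit = X[i][:], fit[i]
    return gbest, gbest_fit

if __name__ == "__main__":
    best, err = pso()
    print("best hyperparameters:", best, "surrogate error:", err)
```

Because each particle's fitness requires an independent (and, for real CNNs, expensive) training run, evaluating the whole swarm concurrently is what yields the speedup the abstract attributes to the distributed design.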
Pages: 36-57
Number of pages: 22
Related Papers
50 records in total
  • [1] Particle Swarm Optimization for Deep learning of Convolution Neural Network
    Khalifa, Mujahid H.
    Ammar, Marwa
    Ouarda, Wael
    Alimi, Adel M.
    PROCEEDINGS OF 2017 SUDAN CONFERENCE ON COMPUTER SCIENCE AND INFORMATION TECHNOLOGY (SCCSIT), 2017, : 73 - 77
  • [2] Neural network hyperparameter optimization based on improved particle swarm optimization
    Xie X.
    He W.
    Zhu Y.
    Yu J.
    High Technology Letters, 2023, 29 (04) : 427 - 433
  • [3] Neural network hyperparameter optimization based on improved particle swarm optimization
    XIE Xiaoyan
    HE Wanqi
    ZHU Yun
    YU Jinhao
    High Technology Letters, 2023, 29 (04) : 427 - 433
  • [4] Efficient Optimization of Convolutional Neural Networks using Particle Swarm Optimization
    Yamasaki, Toshihiko
    Honma, Takuto
    Aizawa, Kiyoharu
    2017 IEEE THIRD INTERNATIONAL CONFERENCE ON MULTIMEDIA BIG DATA (BIGMM 2017), 2017, : 70 - 73
  • [5] Particle swarm optimization of deep neural networks architectures for image classification
    Fernandes Junior, Francisco Erivaldo
    Yen, Gary G.
    SWARM AND EVOLUTIONARY COMPUTATION, 2019, 49 : 62 - 74
  • [6] Integration of Bayesian optimization into hyperparameter tuning of the particle swarm optimization algorithm to enhance neural networks in bearing failure classification
    Soares, Ricardo Cardoso
    Silva, Julio Cesar
    de Lucena Junior, Jose Anselmo
    Lima Filho, Abel Cavalcante
    Ramos, Jorge Gabriel Gomes de Souza
    Brito, Alisson V.
    MEASUREMENT, 2025, 242
  • [7] PSO-PS: Parameter Synchronization with Particle Swarm Optimization for Distributed Training of Deep Neural Networks
    Ye, Qing
    Han, Yuxuan
    Sun, Yanan
    Lv, Jiancheng
    2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
  • [8] Particle Swarm Optimization based RBF Neural Networks Learning Algorithm
    Kang, Qi
    An, Jing
    Yang, Dongsheng
    Wang, Lei
    Wu, Qidi
    2008 7TH WORLD CONGRESS ON INTELLIGENT CONTROL AND AUTOMATION, VOLS 1-23, 2008, : 605 - +
  • [9] Particle Swarm Optimization Based Learning Method for Process Neural Networks
    Liu, Kun
    Tan, Ying
    He, Xingui
    ADVANCES IN NEURAL NETWORKS - ISNN 2010, PT 1, PROCEEDINGS, 2010, 6063 : 280 - 287
  • [10] Efficient hyperparameter optimization with Probability-based Resource Allocating on deep neural networks
    Li, Wenguo
    Yin, Xudong
    Ye, Mudan
    Zhu, Pengxu
    Li, Jinghua
    Yang, Yao
    NEUROCOMPUTING, 2024, 599