Parallel Gradient-Based Local Search Accelerating Particle Swarm Optimization for Training Microwave Neural Network Models

Cited: 0
Authors
Zhang, Jianan [1 ]
Ma, Kai [1 ]
Feng, Feng [1 ,2 ]
Zhang, Qijun [1 ,2 ]
Affiliations
[1] Tianjin Univ, Sch Elect Informat Engn, Tianjin 300072, Peoples R China
[2] Carleton Univ, Dept Elect, Ottawa, ON K1S 5B6, Canada
Keywords
Parallel; particle swarm optimization; neural networks; microwave modeling; message passing interface (MPI);
DOI
Not available
Chinese Library Classification
TP301 [Theory, Methods];
Discipline Code
081202 ;
Abstract
This paper presents a novel global optimization technique for training microwave neural network models. Unlike existing sequential hybrid algorithms, the proposed technique implements parallel gradient-based local search within particle swarm optimization (PSO). The whole swarm is divided into subswarms, one per processor. In each processor, the particle with the lowest error in its subswarm is selected for further local search using the quasi-Newton method. This process is performed in all subswarms in parallel using the message passing interface (MPI). The proposed technique increases both the probability and the speed of finding a global optimum, and is illustrated by two microwave modeling examples.
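The subswarm scheme described in the abstract can be sketched as follows. This is a minimal serial Python sketch, not the paper's implementation: it assumes a toy sphere objective in place of the neural-network training error, substitutes plain finite-difference gradient descent for the quasi-Newton method, and loops over subswarms sequentially where the paper assigns each subswarm to its own MPI rank. All function and parameter names here are illustrative.

```python
import numpy as np

def sphere(x):
    # Toy objective standing in for the neural-network training error.
    return float(np.sum(x ** 2))

def grad(f, x, h=1e-6):
    # Central finite-difference gradient; the paper uses a quasi-Newton
    # method, for which this simple gradient is only a stand-in.
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def pso_with_local_search(f, dim=2, n_subswarms=4, particles_per_sub=5,
                          iters=50, seed=0):
    rng = np.random.default_rng(seed)
    # One block of particles per "processor"; the subswarms are looped
    # over here, where the paper runs them in parallel via MPI.
    pos = rng.uniform(-5, 5, (n_subswarms, particles_per_sub, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([[f(p) for p in sub] for sub in pos])
    for _ in range(iters):
        for s in range(n_subswarms):
            gbest = pbest[s, np.argmin(pbest_val[s])]
            # Standard PSO velocity/position update within the subswarm.
            r1 = rng.random(pos[s].shape)
            r2 = rng.random(pos[s].shape)
            vel[s] = (0.7 * vel[s]
                      + 1.5 * r1 * (pbest[s] - pos[s])
                      + 1.5 * r2 * (gbest - pos[s]))
            pos[s] += vel[s]
            for p in range(particles_per_sub):
                v = f(pos[s, p])
                if v < pbest_val[s, p]:
                    pbest_val[s, p] = v
                    pbest[s, p] = pos[s, p].copy()
            # Gradient-based local search on the subswarm's best particle.
            best = int(np.argmin(pbest_val[s]))
            x = pbest[s, best].copy()
            for _ in range(5):
                x -= 0.1 * grad(f, x)
            v = f(x)
            if v < pbest_val[s, best]:
                pbest_val[s, best] = v
                pbest[s, best] = x
    return float(pbest_val.min())
```

On the sphere objective, the local refinement step shrinks the subswarm best multiplicatively each iteration, so the combined scheme reaches the optimum far faster than the PSO update alone, which is the acceleration effect the abstract claims.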
Pages: 3
Related Papers
50 records in total
  • [41] The algorithms optimization of artificial neural network based on particle swarm
    Yang, Xin-Quan, 1600, Bentham Science Publishers B.V., P.O. Box 294, Bussum, 1400 AG, Netherlands (08):
  • [42] The Research of Fuzzy Neural Network Based on Particle Swarm Optimization
    Man Chun-tao
    Zhang Cai-yun
    Zhang Lu-qi
    Liu Qing-yu
    PROCEEDINGS OF 2013 2ND INTERNATIONAL CONFERENCE ON MEASUREMENT, INFORMATION AND CONTROL (ICMIC 2013), VOLS 1 & 2, 2013, : 1122 - 1125
  • [43] A hybrid of artificial fish swarm algorithm and particle swarm optimization for feedforward neural network training
    Chen, Huadong
    Wang, Shuzong
    Li, Jingxi
    Li, Yunfan
    PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON INTELLIGENT SYSTEMS AND KNOWLEDGE ENGINEERING (ISKE 2007), 2007,
  • [44] Comparative Analysis of Performances of an Improved Particle Swarm Optimization and a Traditional Particle Swarm Optimization for Training of Neural Network Architecture Space
    Comak, Emre
    Gunduz, Gurhan
    ACTA POLYTECHNICA HUNGARICA, 2025, 22 (05) : 7 - 30
  • [45] Accelerating gradient-based topology optimization design with dual-model artificial neural networks
    Qian, Chao
    Ye, Wenjing
    STRUCTURAL AND MULTIDISCIPLINARY OPTIMIZATION, 2021, 63 (04) : 1687 - 1707
  • [47] A diversity-guided hybrid particle swarm optimization based on gradient search
    Han, Fei
    Liu, Qing
    NEUROCOMPUTING, 2014, 137 : 234 - 240
  • [48] A self-adaptive gradient-based particle swarm optimization algorithm with dynamic population topology
    Zhang, Daren
    Ma, Gang
    Deng, Zhuoran
    Wang, Qiao
    Zhang, Guike
    Zhou, Wei
    APPLIED SOFT COMPUTING, 2022, 130
  • [49] Neural Network Training by Hybrid Accelerated Cuckoo Particle Swarm Optimization Algorithm
    Nawi, Nazri Mohd
    Khan, Abdullah
    Rehman, M. Z.
    Aziz, Maslina Abdul
    Herawan, Tutut
    Abawajy, Jemal H.
    NEURAL INFORMATION PROCESSING (ICONIP 2014), PT II, 2014, 8835 : 237 - 244
  • [50] Comparison of Particle Swarm Optimization and Backpropagation Algorithms for Training Feedforward Neural Network
    Mohammadi, Nasser
    Mirabedini, Seyed Javad
    JOURNAL OF MATHEMATICS AND COMPUTER SCIENCE-JMCS, 2014, 12 (02): : 113 - 123