Pruning neural networks with distribution estimation algorithms

Cited: 0
Authors: Cantú-Paz, E [1]
Affiliation: [1] Lawrence Livermore Natl Lab, Ctr Appl Sci Comp, Livermore, CA 94551 USA
DOI: none available
Chinese Library Classification: TP301 [Theory and Methods]
Discipline code: 081202
Abstract:
This paper describes the application of four evolutionary algorithms to the pruning of neural networks used in classification problems. Besides a simple genetic algorithm (GA), the paper considers three distribution estimation algorithms (DEAs): a compact GA, an extended compact GA, and the Bayesian Optimization Algorithm. The objective is to determine whether the DEAs present advantages over the simple GA in terms of accuracy or speed on this problem. The experiments considered a feedforward neural network trained with standard backpropagation and 15 public-domain and artificial data sets. In most cases, the pruned networks seemed to have accuracy better than or equal to that of the original fully-connected networks. We found few differences in the accuracy of the networks pruned by the four EAs, but large differences in execution time. The results suggest that a simple GA with a small population might be the best algorithm for pruning networks on the data sets we tested.
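The GA-based pruning the abstract describes can be sketched as follows. This is an illustrative reconstruction, not the paper's code: a chromosome is a binary mask over a trained network's connections, and fitness is the masked network's classification accuracy. The hand-weighted XOR network, the small sparsity bonus, and all parameter values below are assumptions made for the sketch.

```python
# Sketch of GA-based connection pruning (illustrative, not the paper's code).
# A chromosome is a binary mask over the weights of a small "trained"
# feedforward network; fitness is accuracy of the masked network, with a
# small bonus for sparsity. Weights and data here are hand-picked assumptions.
import random

random.seed(0)

# Fixed 2-4-1 step-activation network solving XOR (weights are assumptions).
W1 = [[6.0, 6.0], [-6.0, -6.0], [6.0, -6.0], [-6.0, 6.0]]
B1 = [-9.0, 3.0, -3.0, -3.0]
W2 = [-6.0, -6.0, 6.0, 6.0]
B2 = -3.0

DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
N_GENES = 4 * 2 + 4  # one mask bit per entry of W1 and W2

def forward(x, mask):
    """Classify input x with the network whose connections are masked."""
    m1, m2 = mask[:8], mask[8:]
    h = []
    for j in range(4):
        s = B1[j] + sum(W1[j][i] * x[i] * m1[2 * j + i] for i in range(2))
        h.append(1.0 if s > 0 else 0.0)
    out = B2 + sum(W2[j] * h[j] * m2[j] for j in range(4))
    return 1 if out > 0 else 0

def fitness(mask):
    acc = sum(forward(x, mask) == y for x, y in DATA) / len(DATA)
    return acc - 0.01 * sum(mask) / N_GENES  # prefer sparser masks at equal accuracy

def prune_with_ga(pop_size=20, gens=30, p_mut=0.05):
    """Simple GA: binary tournament selection, uniform crossover, bit-flip mutation."""
    pop = [[random.randint(0, 1) for _ in range(N_GENES)] for _ in range(pop_size)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop_size):
            p1 = max(random.sample(pop, 2), key=fitness)   # tournament of 2
            p2 = max(random.sample(pop, 2), key=fitness)
            child = [random.choice(g) for g in zip(p1, p2)]  # uniform crossover
            child = [g ^ (random.random() < p_mut) for g in child]  # bit flips
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = prune_with_ga()
```

The DEAs the paper compares (compact GA, extended compact GA, Bayesian Optimization Algorithm) would replace the crossover/mutation step with a model of the distribution of good masks, sampled to produce the next population; the mask encoding and fitness evaluation stay the same.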
Pages: 790-800
Page count: 11