Robust Neural Pruning with Gradient Sampling Optimization for Residual Neural Networks

Cited by: 0
Authors
Yun, Juyoung [1 ]
Affiliation
[1] SUNY Stony Brook, Dept Comp Sci, New York, NY 11794 USA
Keywords
Neural Networks; Optimization; Neural Pruning
DOI
10.1109/IJCNN60899.2024.10650301
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
This research pioneers the integration of gradient sampling optimization techniques, particularly StochGradAdam, into the pruning process of neural networks. Our main objective is to address the significant challenge of maintaining accuracy in pruned neural models, which is critical in resource-constrained scenarios. Through extensive experimentation, we demonstrate that gradient sampling preserves accuracy during and after pruning significantly better than traditional optimization methods. Our study highlights the pivotal role of gradient sampling in robust learning and in retaining crucial information after substantial model simplification. Results on the CIFAR-10 dataset and residual neural architectures validate the versatility and effectiveness of our approach. This work presents a promising direction for developing efficient neural networks without compromising performance, even in environments with limited computational resources.
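The abstract describes a pipeline in which a gradient-sampling variant of Adam trains the network before magnitude-based pruning. The paper's exact StochGradAdam update rule is not reproduced here; the sketch below is an illustrative assumption in NumPy: a random Bernoulli mask keeps only a fraction of gradient entries per step, and the standard Adam moment updates run on the sampled gradient. The function and parameter names (`stochgradadam_step`, `sample_rate`, `magnitude_prune`) are hypothetical, not taken from the paper.

```python
import numpy as np

def stochgradadam_step(param, grad, state, lr=1e-3, beta1=0.9, beta2=0.999,
                       eps=1e-8, sample_rate=0.5, rng=None):
    """One optimizer step in the spirit of gradient-sampling Adam.

    A random fraction `sample_rate` of gradient entries is kept; the rest
    are zeroed before the usual Adam moment updates. This is a sketch of
    the idea, not the paper's exact StochGradAdam rule.
    """
    rng = rng or np.random.default_rng(0)
    mask = rng.random(grad.shape) < sample_rate       # Bernoulli gradient sampling
    g = np.where(mask, grad, 0.0)                     # sampled (sparsified) gradient

    state["t"] += 1
    state["m"] = beta1 * state["m"] + (1 - beta1) * g        # first moment estimate
    state["v"] = beta2 * state["v"] + (1 - beta2) * g * g    # second moment estimate
    m_hat = state["m"] / (1 - beta1 ** state["t"])           # bias correction
    v_hat = state["v"] / (1 - beta2 ** state["t"])
    return param - lr * m_hat / (np.sqrt(v_hat) + eps)

def magnitude_prune(param, sparsity=0.5):
    """Zero out the smallest-magnitude weights (simple magnitude pruning)."""
    k = int(sparsity * param.size)
    if k == 0:
        return param.copy()
    thresh = np.partition(np.abs(param).ravel(), k - 1)[k - 1]
    return np.where(np.abs(param) <= thresh, 0.0, param)

# Usage: one sampled-gradient step, then prune half of the weights.
w = np.array([0.5, -1.2, 0.05, 2.0])
state = {"t": 0, "m": np.zeros_like(w), "v": np.zeros_like(w)}
w = stochgradadam_step(w, grad=np.array([0.1, -0.2, 0.3, 0.4]), state=state)
w_pruned = magnitude_prune(w, sparsity=0.5)
```

In a real training loop the sampling step would be applied per mini-batch across all layers; the claim of the paper is that models trained this way lose less accuracy once pruning removes low-magnitude weights.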
Pages: 10