Robust Neural Pruning with Gradient Sampling Optimization for Residual Neural Networks

Cited by: 0
Author
Yun, Juyoung [1 ]
Affiliation
[1] SUNY Stony Brook, Dept Comp Sci, New York, NY 11794 USA
Keywords
Neural Networks; Optimization; Neural Pruning;
DOI
10.1109/IJCNN60899.2024.10650301
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This research pioneers the integration of gradient sampling optimization techniques, particularly StochGradAdam, into the pruning process of neural networks. Our main objective is to address the significant challenge of maintaining accuracy in pruned neural models, which is critical in resource-constrained scenarios. Through extensive experimentation, we demonstrate that gradient sampling preserves accuracy during and after pruning significantly better than traditional optimization methods. Our study highlights the pivotal role of gradient sampling in robust learning and in retaining crucial information after substantial model simplification. Results on the CIFAR-10 dataset with residual neural architectures validate the versatility and effectiveness of our approach. This work presents a promising direction for developing efficient neural networks without compromising performance, even in environments with limited computational resources.
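The paper's StochGradAdam implementation is not reproduced in this record; the sketch below only illustrates the general idea named in the abstract, under assumptions: gradient sampling is taken to mean updating a random subset of coordinates per Adam step, and pruning is taken to be simple magnitude pruning. The function names (`stoch_grad_adam_step`, `magnitude_prune`) and the Bernoulli coordinate-sampling scheme are illustrative choices, not the authors' specification.

```python
import numpy as np

def stoch_grad_adam_step(w, g, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
                         eps=1e-8, sample_rate=0.5, rng=None):
    """One Adam-style step in which only a randomly sampled subset of
    gradient coordinates contributes; unsampled coordinates receive a
    zero gradient this step (a hypothetical reading of gradient sampling)."""
    rng = rng or np.random.default_rng()
    mask = rng.random(w.shape) < sample_rate   # sampled coordinates
    g = np.where(mask, g, 0.0)                 # drop unsampled gradients
    m = beta1 * m + (1 - beta1) * g            # first-moment estimate
    v = beta2 * v + (1 - beta2) * g ** 2       # second-moment estimate
    m_hat = m / (1 - beta1 ** t)               # bias correction
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

def magnitude_prune(w, sparsity):
    """Return a keep-mask that zeros out the fraction `sparsity` of
    weights with the smallest absolute value (magnitude pruning)."""
    k = int(sparsity * w.size)
    if k == 0:
        return np.ones_like(w, dtype=bool)
    thresh = np.partition(np.abs(w).ravel(), k - 1)[k - 1]
    return np.abs(w) > thresh
```

In a training loop one would alternate such optimizer steps with pruning rounds, multiplying `w` by the keep-mask after each round so pruned weights stay zero.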
Pages: 10
Related Papers
50 records total
  • [21] Optimization of Graph Neural Networks with Natural Gradient Descent
    Izadi, Mohammad Rasool
    Fang, Yihao
    Stevenson, Robert
    Lin, Lizhen
    2020 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2020, : 171 - 179
  • [22] Gradient-Sensitive Optimization for Convolutional Neural Networks
    Liu, Zhipeng
    Feng, Rui
    Li, Xiuhan
    Wang, Wei
    Wu, Xiaoling
    COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2021, 2021
  • [23] Structural optimization by gradient-based neural networks
    Iranmanesh, A
    Kaveh, A
    INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, 1999, 46 (02) : 297 - 311
  • [24] A Multi-objective Particle Swarm Optimization for Neural Networks Pruning
    Wu, Tao
    Shi, Jiao
    Zhou, Deyun
    Lei, Yu
    Gong, Maoguo
    2019 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC), 2019, : 570 - 577
  • [25] Pruning Deep Neural Networks with l0-constrained Optimization
    Phan, Dzung T.
    Nguyen, Lam M.
    Nguyen, Nam H.
    Kalagnanam, Jayant R.
    20TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2020), 2020, : 1214 - 1219
  • [26] Robust graph neural networks with Dirichlet regularization and residual connection
    Yao, Kaixuan
    Du, Zijin
    Li, Ming
    Cao, Feilong
    Liang, Jiye
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2024, 15 (09) : 3733 - 3743
  • [27] Pruning Ratio Optimization with Layer-Wise Pruning Method for Accelerating Convolutional Neural Networks
    Kamma, Koji
    Inoue, Sarimu
    Wada, Toshikazu
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2022, E105D (01) : 161 - 169
  • [28] Robust design optimization with mathematical programming neural networks
    Gupta, KC
    Li, JM
    COMPUTERS & STRUCTURES, 2000, 76 (04) : 507 - 516
  • [29] Spectral Pruning for Recurrent Neural Networks
    Furuya, Takashi
    Suetake, Kazuma
    Taniguchi, Koichi
    Kusumoto, Hiroyuki
    Saiin, Ryuji
    Daimon, Tomohiro
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151, 2022, 151
  • [30] On the use of a pruning prior for neural networks
    Goutte, C
    NEURAL NETWORKS FOR SIGNAL PROCESSING VI, 1996, : 52 - 61