Class-Separation Preserving Pruning for Deep Neural Networks

Cited by: 0
Authors
Preet I. [1 ,2 ]
Boydell O. [1 ]
John D. [3 ]
Affiliations
[1] University College Dublin, CeADAR - Ireland's Centre for Applied AI, Dublin
[2] Eaton Corporation Plc., Dublin
[3] University College Dublin, School of Electrical and Electronics Engineering, Dublin
Keywords
Class-separation score (CSS); deep neural networks (DNNs); pruning; structured pruning
DOI
10.1109/TAI.2022.3228511
Abstract
Neural network pruning has been deemed essential for deploying deep neural networks on resource-constrained edge devices, as it greatly reduces the number of network parameters without drastically compromising accuracy. A class of techniques proposed in the literature assigns an importance score to each parameter and prunes those of least importance. However, most of these methods rely on generalized estimates of each parameter's importance, ignoring the context of the specific task at hand. In this article, we propose a task-specific pruning approach, CSPrune, based on how efficiently a neuron or a convolutional filter is able to separate classes. Our axiomatic approach assigns an importance score based on how separable the different classes are in the output activations or feature maps, preserving class separation and thereby avoiding a reduction in classification accuracy. Additionally, most pruning algorithms prune individual connections or weights, leading to a sparse network, without taking into account whether the hardware the network is deployed on can exploit that sparsity. CSPrune instead prunes whole neurons or filters, yielding a more structured pruned network whose sparsity the hardware can utilize more efficiently. We evaluate our pruning method against various benchmark datasets, both small and large, and network architectures, and show that our approach outperforms comparable pruning techniques. © 2020 IEEE.
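The abstract's core idea — rank each neuron by how well its activations separate the classes, then prune whole low-scoring neurons — can be sketched as follows. This is a hypothetical illustration, not the paper's exact class-separation score (CSS): it substitutes a Fisher-style ratio of between-class to within-class activation variance, and the function names (`separation_score`, `prune_neurons`) are invented for this sketch.

```python
# Hypothetical sketch of class-separation-based structured pruning.
# Stand-in score: between-class variance / within-class variance of a
# neuron's activations (Fisher-style), NOT the paper's exact CSS.
from statistics import mean, pvariance

def separation_score(activations, labels):
    """Score one neuron: how well its activations separate the classes."""
    classes = sorted(set(labels))
    overall = mean(activations)
    per_class = {c: [a for a, y in zip(activations, labels) if y == c]
                 for c in classes}
    # Between-class variance: spread of class means around the overall mean.
    between = mean((mean(per_class[c]) - overall) ** 2 for c in classes)
    # Within-class variance: average spread inside each class.
    within = mean(pvariance(per_class[c]) for c in classes)
    return between / (within + 1e-12)

def prune_neurons(layer_activations, labels, keep_ratio=0.5):
    """Return indices of whole neurons to keep, ranked by separation score."""
    scores = [separation_score(acts, labels) for acts in layer_activations]
    n_keep = max(1, int(len(scores) * keep_ratio))
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return sorted(ranked[:n_keep])

# Toy example: neuron 0 separates the two classes cleanly, neuron 1 does not.
labels = [0, 0, 0, 1, 1, 1]
acts = [
    [0.1, 0.2, 0.1, 0.9, 1.0, 0.8],  # discriminative activations
    [0.5, 0.4, 0.6, 0.5, 0.6, 0.4],  # uninformative activations
]
print(prune_neurons(acts, labels, keep_ratio=0.5))  # prints [0]
```

Because entire neurons (rather than individual weights) are dropped, the surviving network keeps a dense, regular structure, which is the hardware-friendliness argument the abstract makes for structured pruning.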
Pages: 290-299
Page count: 9