Class-Separation Preserving Pruning for Deep Neural Networks

Cited by: 0
Authors
Preet I. [1 ,2 ]
Boydell O. [1 ]
John D. [3 ]
Affiliations
[1] University College Dublin, CeADAR - Ireland's Centre for Applied AI, Dublin
[2] Eaton Corporation Plc., Dublin
[3] University College Dublin, School of Electrical and Electronics Engineering, Dublin
Keywords
Class-separation score (CSS); deep neural networks (DNNs); pruning; structured pruning;
DOI
10.1109/TAI.2022.3228511
Abstract
Neural network pruning has been deemed essential for deploying deep neural networks on resource-constrained edge devices, as it greatly reduces the number of network parameters without drastically compromising accuracy. One class of techniques proposed in the literature assigns an importance score to each parameter and prunes those of least importance. However, most of these methods rely on generalized estimates of each parameter's importance and ignore the context of the specific task at hand. In this article, we propose a task-specific pruning approach, CSPrune, which is based on how efficiently a neuron or a convolutional filter is able to separate classes. Our axiomatic approach assigns an importance score based on how separable the different classes are in the output activations or feature maps, preserving class separation and thereby avoiding a reduction in classification accuracy. Additionally, most pruning algorithms prune individual connections or weights, producing an unstructured sparse network without considering whether the hardware on which the network is deployed can actually exploit that sparsity. CSPrune instead prunes whole neurons or filters, which results in a more structured pruned network whose sparsity can be utilized more efficiently by the hardware. We evaluate our pruning method against various benchmark datasets, both small and large, and network architectures, and show that our approach outperforms comparable pruning techniques. © 2020 IEEE.
Pages: 290-299
Page count: 9
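
As a rough, illustrative sketch of the idea summarized in the abstract (not the authors' implementation), the Python snippet below scores each neuron or filter by how well its per-sample activations separate the classes and keeps only the highest-scoring units, in the spirit of structured pruning. The Fisher-style between-/within-class ratio, the function names, and the synthetic data are all assumptions standing in for the paper's class-separation score (CSS).

import numpy as np

# Illustrative sketch only: a Fisher-style separation ratio is used as a stand-in
# for the paper's class-separation score (CSS); the actual CSS definition may differ.

def class_separation_scores(activations, labels):
    """activations: (n_samples, n_units) per-unit activations, e.g. channel-wise
    global-average-pooled feature maps; labels: (n_samples,) integer class ids.
    Returns one separation score per unit (higher = classes better separated)."""
    overall_mean = activations.mean(axis=0)                        # (n_units,)
    between = np.zeros_like(overall_mean)
    within = np.zeros_like(overall_mean)
    for c in np.unique(labels):
        cls_act = activations[labels == c]                         # (n_c, n_units)
        cls_mean = cls_act.mean(axis=0)
        between += len(cls_act) * (cls_mean - overall_mean) ** 2   # between-class scatter
        within += ((cls_act - cls_mean) ** 2).sum(axis=0)          # within-class scatter
    return between / (within + 1e-8)

def keep_mask(scores, prune_ratio):
    """Boolean mask over units: True = keep, False = prune the whole neuron/filter."""
    k = int(round(len(scores) * prune_ratio))                      # number of units to drop
    if k <= 0:
        return np.ones_like(scores, dtype=bool)
    k = min(k, len(scores) - 1)                                    # always keep at least one unit
    threshold = np.sort(scores)[k]                                 # the k lowest scores fall below this
    return scores >= threshold

# Toy usage: 512 samples, 64 filters, 10 classes; prune the 50% least separating filters.
rng = np.random.default_rng(0)
acts = rng.normal(size=(512, 64))
labels = rng.integers(0, 10, size=512)
mask = keep_mask(class_separation_scores(acts, labels), prune_ratio=0.5)
print(f"kept {mask.sum()} of {mask.size} filters")

In practice the activations would come from a trained network evaluated on a held-out calibration set, the mask would be applied per layer, and the pruned network would be fine-tuned afterwards; those steps are omitted from this sketch.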
Related Papers
50 related records in total; the first 10 are listed below.
  • [1] Entezari, Rahim; Saukh, Olga. Class-dependent Pruning of Deep Neural Networks. 2020 IEEE Second Workshop on Machine Learning on Edge in Sensor Systems (SenSys-ML 2020), 2020: 13-18.
  • [2] Marchisio, Alberto; Hanif, Muhammad Abdullah; Martina, Maurizio; Shafique, Muhammad. PruNet: Class-Blind Pruning Method for Deep Neural Networks. 2018 International Joint Conference on Neural Networks (IJCNN), 2018.
  • [3] Wang, Wenbo; Nguyen, Tien N.; Wang, Shaohua; Li, Yi; Zhang, Jiyuan; Yadavally, Aashish. DeepVD: Toward Class-Separation Features for Neural Network Vulnerability Detection. 2023 IEEE/ACM 45th International Conference on Software Engineering (ICSE), 2023: 2249-2261.
  • [4] Vadera, Sunil; Ameen, Salem. Methods for Pruning Deep Neural Networks. IEEE Access, 2022, 10: 63280-63300.
  • [5] Anwar, Sajid; Hwang, Kyuyeon; Sung, Wonyong. Structured Pruning of Deep Convolutional Neural Networks. ACM Journal on Emerging Technologies in Computing Systems, 2017, 13(3).
  • [6] Ardakani, Arash; Condo, Carlo; Gross, Warren J. Activation Pruning of Deep Convolutional Neural Networks. 2017 IEEE Global Conference on Signal and Information Processing (GlobalSIP 2017), 2017: 1325-1329.
  • [7] Aghasi, Alireza; Abdi, Afshin; Romberg, Justin. Fast Convex Pruning of Deep Neural Networks. SIAM Journal on Mathematics of Data Science, 2020, 2(1): 158-188.
  • [8] Sakai, Yasufumi; Iwakawa, Akinori; Tabaru, Tsuguchika; Inoue, Atsuki; Kawaguchi, Hiroshi. Automatic Pruning Rate Derivation for Structured Pruning of Deep Neural Networks. 2022 26th International Conference on Pattern Recognition (ICPR), 2022: 2561-2567.
  • [9] Li, Lianqiang; Zhu, Jie; Sun, Ming-Ting. Deep Learning Based Method for Pruning Deep Neural Networks. 2019 IEEE International Conference on Multimedia & Expo Workshops (ICMEW), 2019: 312-317.
  • [10] Jiang, Mengnan; Wang, Jingcun; Eldebiky, Amro; Yin, Xunzhao; Zhuo, Cheng; Lin, Ing-Chao; Li Zhang, Grace. Class-Aware Pruning for Efficient Neural Networks. 2024 Design, Automation & Test in Europe Conference & Exhibition (DATE), 2024.