Auto-Balanced Filter Pruning for Efficient Convolutional Neural Networks

Cited by: 0
Authors
Ding, Xiaohan [1 ]
Ding, Guiguang [1 ]
Han, Jungong [2 ]
Tang, Sheng [3 ]
Institutions
[1] Tsinghua Univ, Sch Software, Beijing 100084, Peoples R China
[2] Univ Lancaster, Sch Comp & Commun, Lancaster LA1 4YW, England
[3] Chinese Acad Sci, Inst Comp Technol, Beijing 100190, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
DOI
Not available
CLC Number
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In recent years, considerable research effort has been devoted to compression techniques for convolutional neural networks (CNNs). Many works so far have focused on CNN connection-pruning methods, which produce sparse parameter tensors in convolutional or fully-connected layers. Several studies have demonstrated that even simple methods can effectively eliminate connections of a CNN. However, since these methods make parameter tensors merely sparser, not smaller, the compression may not transfer directly to acceleration without support from specially designed hardware. In this paper, we propose an iterative approach named Auto-balanced Filter Pruning, where we pre-train the network in an innovative auto-balanced way to transfer the representational capacity of its convolutional layers to a fraction of the filters, prune the redundant ones, and then re-train it to restore the accuracy. In this way, a smaller version of the original network is learned and the floating-point operations (FLOPs) are reduced. By applying this method to several common CNNs, we show that a large portion of the filters can be discarded without an obvious accuracy drop, leading to a significant reduction of computational burden. Concretely, we reduce the inference cost of LeNet-5 on MNIST, VGG-16 and ResNet-56 on CIFAR-10 by 95.1%, 79.7% and 60.9%, respectively.
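The pipeline described in the abstract (pre-train, prune redundant filters, re-train) hinges on ranking and discarding whole filters of a convolutional layer. A minimal NumPy sketch of the generic magnitude-based filter-selection step is shown below; `prune_filters` and `keep_ratio` are illustrative names, and the paper's auto-balanced pre-training regularization itself is not reproduced here.

```python
import numpy as np

def prune_filters(weights, keep_ratio=0.5):
    """Keep the top keep_ratio fraction of filters by L1 norm.

    weights: array of shape (out_channels, in_channels, kh, kw),
             one filter per output channel.
    Returns the pruned weight tensor and the kept filter indices
    (in their original order).
    """
    # L1 norm of each filter, computed over all its weights
    norms = np.abs(weights).reshape(weights.shape[0], -1).sum(axis=1)
    n_keep = max(1, int(round(keep_ratio * weights.shape[0])))
    # indices of the n_keep largest-norm filters, restored to original order
    keep = np.sort(np.argsort(norms)[::-1][:n_keep])
    return weights[keep], keep

# toy example: 8 random 3x3x3 filters, discard half
rng = np.random.default_rng(0)
w = rng.normal(size=(8, 3, 3, 3))
pruned, kept = prune_filters(w, keep_ratio=0.5)
print(pruned.shape)  # (4, 3, 3, 3)
```

In a full pruning pass, the corresponding input channels of the *next* layer's kernels must also be sliced with `kept`, which is what makes the network genuinely smaller (fewer FLOPs) rather than merely sparser.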
Pages: 6797-6804
Page count: 8
Related Papers
50 records in total
  • [31] Sensitive Faraday rotation measurement with auto-balanced photodetection
    Chang, Chia-Yu
    Wang, Likarn
    Shy, Jow-Tsong
    Lin, Chu-En
    Chou, Chien
    REVIEW OF SCIENTIFIC INSTRUMENTS, 2011, 82 (06):
  • [32] Filter pruning with uniqueness mechanism in the frequency domain for efficient neural networks
    Zhang, Shuo
    Gao, Mingqi
    Ni, Qiang
    Han, Jungong
    NEUROCOMPUTING, 2023, 530 : 116 - 124
  • [33] Filter Pruning using Hierarchical Group Sparse Regularization for Deep Convolutional Neural Networks
    Mitsuno, Kakeru
    Kurita, Takio
    2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021, : 1089 - 1095
  • [34] HFP: Hardware-Aware Filter Pruning for Deep Convolutional Neural Networks Acceleration
    Yu, Fang
    Han, Chuanqi
    Wang, Pengcheng
    Huang, Ruoran
    Huang, Xi
    Cui, Li
    2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021, : 255 - 262
  • [35] CAPTOR: A Class Adaptive Filter Pruning Framework for Convolutional Neural Networks in Mobile Applications
    Qin, Zhuwei
    Yu, Fuxun
    Liu, Chenchen
    Chen, Xiang
    24TH ASIA AND SOUTH PACIFIC DESIGN AUTOMATION CONFERENCE (ASP-DAC 2019), 2019, : 444 - 449
  • [36] Incremental Filter Pruning via Random Walk for Accelerating Deep Convolutional Neural Networks
    Li, Qinghua
    Li, Cuiping
    Chen, Hong
    PROCEEDINGS OF THE 13TH INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING (WSDM '20), 2020, : 358 - 366
  • [37] Batch-Normalization-based Soft Filter Pruning for Deep Convolutional Neural Networks
    Xu, Xiaozhou
    Chen, Qiming
    Xie, Lei
    Su, Hongye
    16TH IEEE INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION, ROBOTICS AND VISION (ICARCV 2020), 2020, : 951 - 956
  • [38] Gate Decorator: Global Filter Pruning Method for Accelerating Deep Convolutional Neural Networks
    You, Zhonghui
    Yan, Kun
    Ye, Jinmian
    Ma, Meng
    Wang, Ping
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [39] DPFPS: Dynamic and Progressive Filter Pruning for Compressing Convolutional Neural Networks from Scratch
    Ruan, Xiaofeng
    Liu, Yufan
    Li, Bing
    Yuan, Chunfeng
    Hu, Weiming
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 2495 - 2503
  • [40] DyFiP: Explainable AI-based Dynamic Filter Pruning of Convolutional Neural Networks
    Sabih, Muhammad
    Hannig, Frank
    Teich, Juergen
    PROCEEDINGS OF THE 2022 2ND EUROPEAN WORKSHOP ON MACHINE LEARNING AND SYSTEMS (EUROMLSYS '22), 2022, : 109 - 115