D-Pruner: Filter-Based Pruning Method for Deep Convolutional Neural Network

Cited by: 4
Authors
Huynh, Loc N. [1 ]
Lee, Youngki [1 ]
Balan, Rajesh Krishna [1 ]
Affiliations
[1] Singapore Management Univ, Singapore, Singapore
Keywords
Continuous Vision; Deep Learning; Compression;
DOI
10.1145/3212725.3212730
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
The emergence of augmented reality devices such as Google Glass and Microsoft HoloLens has opened up a new class of vision sensing applications. These applications often need to continuously capture and analyze contextual information from video streams. They typically adopt deep learning algorithms such as convolutional neural networks (CNNs) to achieve high recognition accuracy, yet running such computationally intensive algorithms on resource-constrained mobile devices remains a severe challenge. In this paper, we propose and explore a new compression technique, called D-Pruner, that efficiently prunes redundant parameters within a CNN model so that the model can run efficiently on mobile devices. D-Pruner removes redundancy by embedding a small additional network that evaluates the importance of filters and removes unimportant ones during the fine-tuning phase, efficiently reducing the size of the model while maintaining the accuracy of the original model. We evaluated D-Pruner on datasets such as CIFAR-10 and CIFAR-100 and showed that it can reduce the number of parameters of many existing models by up to 4.4 times while keeping the accuracy drop below 1%.
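The sketch below illustrates the general prune-then-fine-tune workflow that the abstract describes, not the authors' D-Pruner implementation: D-Pruner learns filter importance with a small embedded network during fine-tuning, whereas this sketch substitutes a simple L1-norm importance score as a stand-in. The function names (filter_importance, prune_conv_pair) and the PyTorch setup are illustrative assumptions.

# Minimal filter-pruning sketch (NOT the D-Pruner implementation).
# Importance is approximated by the L1 norm of each filter; D-Pruner
# instead learns importance with a small auxiliary network.
import torch
import torch.nn as nn


def filter_importance(conv: nn.Conv2d) -> torch.Tensor:
    # Stand-in importance score: L1 norm of each output filter's weights.
    return conv.weight.detach().abs().sum(dim=(1, 2, 3))


def prune_conv_pair(conv: nn.Conv2d, next_conv: nn.Conv2d, keep_ratio: float):
    # Return new (conv, next_conv) with the lowest-importance filters removed.
    scores = filter_importance(conv)
    n_keep = max(1, int(keep_ratio * conv.out_channels))
    keep = torch.argsort(scores, descending=True)[:n_keep].sort().values

    pruned = nn.Conv2d(conv.in_channels, n_keep, conv.kernel_size,
                       conv.stride, conv.padding, bias=conv.bias is not None)
    pruned.weight.data = conv.weight.data[keep].clone()
    if conv.bias is not None:
        pruned.bias.data = conv.bias.data[keep].clone()

    # The following layer must drop the matching input channels.
    next_pruned = nn.Conv2d(n_keep, next_conv.out_channels, next_conv.kernel_size,
                            next_conv.stride, next_conv.padding,
                            bias=next_conv.bias is not None)
    next_pruned.weight.data = next_conv.weight.data[:, keep].clone()
    if next_conv.bias is not None:
        next_pruned.bias.data = next_conv.bias.data.clone()
    return pruned, next_pruned


if __name__ == "__main__":
    conv1 = nn.Conv2d(3, 64, 3, padding=1)
    conv2 = nn.Conv2d(64, 128, 3, padding=1)
    conv1, conv2 = prune_conv_pair(conv1, conv2, keep_ratio=0.5)
    x = torch.randn(1, 3, 32, 32)           # CIFAR-sized input
    print(conv2(conv1(x)).shape)             # torch.Size([1, 128, 32, 32])
    # In practice, the pruned model is fine-tuned afterwards to recover accuracy.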
Pages: 7-12
Number of pages: 6
Related Papers
50 records in total
  • [1] A Novel Filter-Level Deep Convolutional Neural Network Pruning Method Based on Deep Reinforcement Learning
    Feng, Yihao
    Huang, Chao
    Wang, Long
    Luo, Xiong
    Li, Qingwen
    APPLIED SCIENCES-BASEL, 2022, 12 (22):
  • [2] LM Filter-Based Deep Convolutional Neural Network for Pedestrian Attribute Recognition
    Uzen, Huseyin
    Hanbay, Kazim
    JOURNAL OF POLYTECHNIC-POLITEKNIK DERGISI, 2020, 23 (03): : 605 - 613
  • [3] Extended Kalman filter-based pruning method for recurrent neural networks
    Sum, J
    Chan, LW
    Leung, CS
    Young, GH
    NEURAL COMPUTATION, 1998, 10 (06) : 1481 - 1505
  • [4] A Filter Rank Based Pruning Method for Convolutional Neural Networks
    Liu, Hao
    Guan, Zhenyu
    Lei, Peng
    2021 IEEE 20TH INTERNATIONAL CONFERENCE ON TRUST, SECURITY AND PRIVACY IN COMPUTING AND COMMUNICATIONS (TRUSTCOM 2021), 2021, : 1318 - 1322
  • [5] Overview of Deep Convolutional Neural Network Pruning
    Li, Guang
    Liu, Fang
    Xia, Yuping
    2020 INTERNATIONAL CONFERENCE ON IMAGE, VIDEO PROCESSING AND ARTIFICIAL INTELLIGENCE, 2020, 11584
  • [6] Convolutional Neural Network Pruning Using Filter Attenuation
    Mousa-Pasandi, Morteza
    Hajabdollahi, Mohsen
    Karimi, Nader
    Samavi, Shadrokh
    Shirani, Shahram
    2020 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2020, : 2905 - 2909
  • [7] ThiNet: A Filter Level Pruning Method for Deep Neural Network Compression
    Luo, Jian-Hao
    Wu, Jianxin
    Lin, Weiyao
    2017 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2017, : 5068 - 5076
  • [8] Deep Neural Network Channel Pruning Compression Method for Filter Elasticity
    Li, Ruiquan
    Zhu, Lu
    Liu, Yuanyuan
    Computer Engineering and Applications, 2024, 60 (06) : 163 - 171
  • [9] An optimal-score-based filter pruning for deep convolutional neural networks
    Sawant, Shrutika S.
    Bauer, J.
    Erick, F. X.
    Ingaleshwar, Subodh
    Holzer, N.
    Ramming, A.
    Lang, E. W.
    Goetz, Th
    APPLIED INTELLIGENCE, 2022, 52 (15) : 17557 - 17579