A Study of Filter Duplication for CNNs Filter Pruning

Cited: 0
Authors
Ikuta, Ryosuke [1 ]
Yata, Noriko [1 ]
Manabe, Yoshitsugu [1 ]
Affiliations
[1] Chiba Univ, 1-33 Yayoicho, Inage-ku, Chiba, Chiba 2638522, Japan
Source
INTERNATIONAL WORKSHOP ON ADVANCED IMAGING TECHNOLOGY, IWAIT 2024 | 2024 / Vol. 13164
Keywords
CNN; pruning; redundancy; filter duplication
DOI
10.1117/12.3018876
CLC classification number
TP18 [Theory of Artificial Intelligence]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Convolutional Neural Networks (CNNs) have demonstrated great success in image recognition, but most trained models are over-parameterized and can be compressed with only a slight performance degradation. Pruning is a network compression technique that reduces the computational cost of inference by selectively removing filters that do not contribute to performance. While various methods have been proposed to identify unimportant filters, determining how many filters to remove from each layer without a significant loss of accuracy remains an open problem. This paper proposes a "filter duplication" approach that reduces the accuracy degradation caused by pruning, especially at higher compression ratios. Filter duplication replaces unimportant filters with copies of critical filters in a pre-trained model, based on the measured importance of the filters in each convolutional layer, before pruning is applied. In experiments with mainstream CNN models and datasets, we confirmed that filter duplication improves the accuracy of the pruned model, particularly at higher compression ratios. In addition, the proposed method reflects the structural redundancy of the network in the per-layer compression ratios, yielding more efficient compression. The results show that duplicating an appropriate number of critical filters in each layer improves the robustness of the network against pruning, and that further optimization of the duplication strategy is desirable.
Pages: 6
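
As an illustration of the filter-duplication step described in the abstract, the following is a minimal PyTorch sketch, under assumptions the paper does not specify here: filter importance is taken as the L1 norm of each filter's weights, and the least important filters in a layer are simply overwritten with copies of the most important ones before a standard pruning criterion is applied. The function name duplicate_filters and the per-layer duplication count are illustrative, not taken from the paper.

# Minimal sketch of filter duplication before pruning (PyTorch).
# Assumed details not given in the abstract: L1-norm filter importance and
# a direct copy of the k most important filters over the k least important ones.
import torch
import torch.nn as nn


def duplicate_filters(conv: nn.Conv2d, num_duplicates: int) -> None:
    """Overwrite the least important filters of `conv` with copies of its
    most important filters, in place."""
    with torch.no_grad():
        weight = conv.weight.data                     # shape: (out_ch, in_ch, kH, kW)
        importance = weight.abs().sum(dim=(1, 2, 3))  # assumed metric: L1 norm per filter
        order = importance.argsort()                  # filter indices, least important first
        weak = order[:num_duplicates]                 # filters to be replaced
        strong = order[-num_duplicates:]              # filters to be duplicated
        weight[weak] = weight[strong].clone()
        if conv.bias is not None:
            conv.bias.data[weak] = conv.bias.data[strong].clone()


# Example: duplicate 8 filters in every convolutional layer of a small model
# before handing the model to an ordinary filter-pruning routine.
model = nn.Sequential(
    nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
)
for module in model.modules():
    if isinstance(module, nn.Conv2d):
        duplicate_filters(module, num_duplicates=8)

The intent, as described in the abstract, is that the duplicated critical filters make the network more robust to subsequent pruning; the importance measure and the per-layer duplication counts actually used in the paper may differ from this sketch.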