A Study of Filter Duplication for CNNs Filter Pruning

Times Cited: 0
Authors
Ikuta, Ryosuke [1]
Yata, Noriko [1]
Manabe, Yoshitsugu [1]
Affiliation
[1] Chiba Univ, 1-33 Yayoicho, Inage-ku, Chiba, Chiba 263-8522, Japan
Source
INTERNATIONAL WORKSHOP ON ADVANCED IMAGING TECHNOLOGY, IWAIT 2024 | 2024, Vol. 13164
Keywords
CNN; pruning; redundancy; filter duplication
DOI
10.1117/12.3018876
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Convolutional Neural Networks (CNNs) have demonstrated great success in image recognition, but most trained models are over-parameterized and can be compressed with only slight performance degradation. Pruning is a network compression technique that obtains a model with a lower inference cost by selectively removing filters that do not contribute to performance. While various methods have been proposed to identify unimportant filters, determining how many filters to remove at each layer without causing a significant loss of accuracy remains an open problem. This paper proposes a "filter duplication" approach to reduce the accuracy degradation caused by pruning, especially at higher compression ratios. Filter duplication replaces unimportant filters in a pre-trained model with copies of critical filters, based on the measured importance of each convolutional layer, before pruning. In experiments with mainstream CNN models and datasets, we confirmed that filter duplication improves the accuracy of the pruned model, especially at higher compression ratios. In addition, the proposed method reflects the structural redundancy of the network in the per-layer compression ratios, providing more efficient compression. The results show that duplicating an appropriate number of critical filters for each layer improves the robustness of the network against pruning, and that further optimization of the duplication method is desirable.
Pages: 6
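
The abstract above gives no implementation details, but the core idea of filter duplication can be illustrated with a minimal sketch. The PyTorch snippet below is an assumption-laden illustration, not the paper's method: it assumes L1-norm filter importance and a fixed, hand-chosen duplication ratio, and the function name `duplicate_filters` and parameter `duplication_ratio` are hypothetical. Before pruning, the least important filters of each convolutional layer are overwritten with copies of the most important ("critical") ones, so that a subsequent filter-pruning pass removes mostly duplicated information.

```python
import torch
import torch.nn as nn

def duplicate_filters(conv: nn.Conv2d, duplication_ratio: float = 0.25) -> None:
    """Overwrite the least important filters of a conv layer with copies of
    the most important ("critical") ones before pruning.

    Importance is assumed here to be the L1 norm of each filter's weights;
    the abstract does not specify the criterion, so this is illustrative only.
    """
    with torch.no_grad():
        weight = conv.weight.data                      # shape: (out_ch, in_ch, kH, kW)
        importance = weight.abs().sum(dim=(1, 2, 3))   # assumed L1-norm importance per filter
        n_dup = max(1, int(duplication_ratio * weight.size(0)))
        critical = torch.topk(importance, n_dup, largest=True).indices
        unimportant = torch.topk(importance, n_dup, largest=False).indices
        # Replace the unimportant filters (and their biases) with the critical ones.
        weight[unimportant] = weight[critical].clone()
        if conv.bias is not None:
            conv.bias.data[unimportant] = conv.bias.data[critical].clone()

# Example: apply to every conv layer of a (toy) pre-trained model, then hand
# the model to any standard filter-pruning pipeline.
model = nn.Sequential(nn.Conv2d(3, 16, 3), nn.ReLU(), nn.Conv2d(16, 32, 3))
for module in model.modules():
    if isinstance(module, nn.Conv2d):
        duplicate_filters(module, duplication_ratio=0.25)
```

Per the abstract, the number of duplicated filters is derived from the measured importance of each convolutional layer rather than a single fixed ratio; the constant `duplication_ratio` above is only a stand-in for that per-layer decision.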