A Study of Filter Duplication for CNNs Filter Pruning

Cited: 0
Authors
Ikuta, Ryosuke [1 ]
Yata, Noriko [1 ]
Manabe, Yoshitsugu [1 ]
Affiliations
[1] Chiba Univ, 1-33 Yayoicho,Inage Ku, Chiba, Chiba 2638522, Japan
Source
INTERNATIONAL WORKSHOP ON ADVANCED IMAGING TECHNOLOGY, IWAIT 2024 | 2024 / Vol. 13164
Keywords
CNN; pruning; redundancy; filter duplication
DOI
10.1117/12.3018876
CLC Number
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Convolutional Neural Networks (CNNs) have demonstrated great success in image recognition, but most trained models are over-parameterized and can be compressed with only a slight performance degradation. Pruning is one such lightweighting technique: it produces a model with a lower inference cost by selectively removing filters that do not contribute to performance. While various methods have been proposed to identify unimportant filters, determining how many filters to remove at each layer without a significant loss of accuracy remains an open problem. This paper proposes a "filter duplication" approach to reduce the accuracy degradation caused by pruning, especially at higher compression ratios. Before pruning, filter duplication replaces unimportant filters in a pre-trained model with copies of critical filters, based on the measured importance of the filters in each convolutional layer. In experiments with mainstream CNN models and datasets, we confirmed that filter duplication improves the accuracy of the pruned model, especially at higher compression ratios. In addition, the proposed method reflects the structural redundancy of the network in the per-layer compression ratio, yielding more efficient compression. The results show that duplicating an appropriate number of critical filters in each layer improves the robustness of the network against pruning, and that further optimization of the duplication strategy is desirable.
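The abstract does not specify the importance measure or the duplication mechanics, so the following PyTorch fragment is only a minimal sketch of the general idea, assuming an L1-norm importance score and in-place replacement of the weakest filters with copies of the strongest ones before pruning; the names duplicate_filters and num_duplicates are illustrative, not from the paper.

import torch
import torch.nn as nn

def duplicate_filters(conv: nn.Conv2d, num_duplicates: int) -> None:
    # Overwrite the least important filters with copies of the most
    # important ones, in place, before pruning is applied.
    with torch.no_grad():
        weight = conv.weight                           # (out_ch, in_ch, kH, kW)
        # Assumed importance score: L1 norm of each output filter
        # (the paper's actual measure may differ).
        importance = weight.abs().sum(dim=(1, 2, 3))
        order = importance.argsort()                   # ascending: weakest first
        victims = order[:num_duplicates]               # filters to overwrite
        donors = order[-num_duplicates:]               # critical filters to copy
        weight[victims] = weight[donors]
        if conv.bias is not None:
            conv.bias[victims] = conv.bias[donors]

conv = nn.Conv2d(16, 32, kernel_size=3)
duplicate_filters(conv, num_duplicates=4)  # then prune the layer as usual

The intuition, per the abstract, is that a pruning criterion will later remove one copy of each duplicated critical filter in place of a genuinely unimportant one, so the pruned network retains more of its critical capacity at high compression ratios.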
Pages: 6
Related Papers (50 in total)
  • [41] A novel technique for optimizing the filter size of CNNs without backpropagation
    Maqbool, Muhammad Manzar
    Took, Clive Cheong
    Sanei, Saeid
    2023 IEEE STATISTICAL SIGNAL PROCESSING WORKSHOP, SSP, 2023, : 354 - 358
  • [42] Robust pruning for efficient CNNs
    Ide, Hidenori
    Kobayashi, Takumi
    Watanabe, Kenji
    Kurita, Takio
    PATTERN RECOGNITION LETTERS, 2020, 135 : 90 - 98
  • [43] Progressive Local Filter Pruning for Image Retrieval Acceleration
    Wang, Xiaodong
    Zheng, Zhedong
    He, Yang
    Yan, Fei
    Zeng, Zhiqiang
    Yang, Yi
    IEEE TRANSACTIONS ON MULTIMEDIA, 2023, 25 : 9597 - 9607
  • [44] Efficient tensor decomposition-based filter pruning
    Pham, Van Tien
    Zniyed, Yassine
    Nguyen, Thanh Phuong
    NEURAL NETWORKS, 2024, 178
  • [45] Soft and Hard Filter Pruning via Dimension Reduction
    Cai, Linhang
    An, Zhulin
    Yang, Chuanguang
    Xu, Yongjun
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021
  • [46] Filter Pruning via Measuring Feature Map Information
    Shao, Linsong
    Zuo, Haorui
    Zhang, Jianlin
    Xu, Zhiyong
    Yao, Jinzhen
    Wang, Zhixing
    Li, Hong
    SENSORS, 2021, 21 (19)
  • [47] Filter Pruning by High-Order Spectral Clustering
    Lin, Hang
    Peng, Yifan
    Zhang, Yubo
    Bie, Lin
    Zhao, Xibin
    Gao, Yue
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2025, 47 (04) : 2402 - 2415
  • [48] Combine-Net: An Improved Filter Pruning Algorithm
    Wang, Jinghan
    Li, Guangyue
    Zhang, Wenzhao
    INFORMATION, 2021, 12 (07)
  • [49] Filter Pruning with Convolutional Approximation Small Model Framework
    Intraraprasit, Monthon
    Chitsobhuk, Orachat
    COMPUTATION, 2023, 11 (09)
  • [50] Feature independent Filter Pruning by Successive Layers analysis
    Mondal, Milton
    Das, Bishshoy
    Lall, Brejesh
    Singh, Pushpendra
    Roy, Sumantra Dutta
    Joshi, Shiv Dutt
    COMPUTER VISION AND IMAGE UNDERSTANDING, 2023, 236