Structured pruning via feature channels similarity and mutual learning for convolutional neural network compression

Cited by: 0
Authors
Wei Yang
Yancai Xiao
Affiliations
[1] School of Mechanical, Electronic and Control Engineering, Beijing Jiaotong University
[2] Key Laboratory of Vehicle Advanced Manufacturing, Measuring and Control Technology, Ministry of Education, Beijing Jiaotong University
Source
Applied Intelligence | 2022, Vol. 52
Keywords
Convolutional neural network; Model compression; Feature channels similarity; Mutual learning
DOI: not available
Abstract
The deployment of convolutional neural networks (CNNs) on resource-constrained devices is hindered by their large memory footprint and computational cost. To obtain a lightweight network, we propose the feature channels similarity and mutual learning fine tuning (FCS-MLFT) method. First, we target the similarity redundancy among the output feature channels of a CNN and propose a novel structured pruning criterion based on cosine similarity: K-Means is used to cluster the convolution kernels, according to the L1 norms of the corresponding feature maps, into several bins, and similarity values are then computed between feature channels within each bin. Second, unlike the traditional approach of fine-tuning the compressed model with the same strategy used for training, we apply mutual learning fine tuning (MLFT) to improve the accuracy of the compact model; the proposed method matches the accuracy of traditional fine tuning (TFT) while requiring significantly fewer epochs. Experimental results show not only that the FCS criterion outperforms existing criteria, such as kernel norm-based and layer-wise feature norm-based methods, but also that the MLFT strategy reduces the number of training epochs.
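The abstract describes two components: a pruning criterion (L1-norm K-Means bins, then cosine similarity within each bin) and a mutual-learning fine-tuning objective. The sketch below is an illustrative reconstruction, not the authors' code: function names, the choice of maximum within-bin cosine similarity as the redundancy score, and all parameters are assumptions; the mutual-learning loss follows the standard deep-mutual-learning form (cross-entropy plus a KL term toward a peer network), which MLFT builds on.

```python
import numpy as np
from sklearn.cluster import KMeans


def fcs_prune_indices(feature_maps, n_bins=2, prune_ratio=0.25):
    """Illustrative FCS-style criterion (assumed details, not the paper's code).

    feature_maps: (C, H, W) array, one output feature map per channel.
    Returns the indices of the channels selected for pruning.
    """
    C = feature_maps.shape[0]
    flat = feature_maps.reshape(C, -1)

    # Step 1: L1 norm of each channel's feature map.
    l1 = np.abs(flat).sum(axis=1)

    # Step 2: cluster channels into bins by L1 norm (1-D K-Means).
    labels = KMeans(n_clusters=n_bins, n_init=10, random_state=0).fit_predict(
        l1.reshape(-1, 1))

    # Step 3: within each bin, score a channel by its highest cosine
    # similarity to any other channel in the bin (near-duplicates score ~1).
    unit = flat / (np.linalg.norm(flat, axis=1, keepdims=True) + 1e-12)
    scores = np.zeros(C)
    for b in range(n_bins):
        idx = np.where(labels == b)[0]
        if len(idx) < 2:
            continue
        sim = unit[idx] @ unit[idx].T
        np.fill_diagonal(sim, -1.0)  # ignore self-similarity
        scores[idx] = sim.max(axis=1)

    # Step 4: prune the channels with the highest redundancy scores.
    n_prune = int(round(prune_ratio * C))
    return np.argsort(scores)[::-1][:n_prune]


def mutual_learning_loss(logits_a, logits_b, labels):
    """Deep-mutual-learning-style objective for network A: cross-entropy on
    the labels plus KL(p_B || p_A) pulling A toward its peer's predictions."""
    log_pa = logits_a - np.log(np.exp(logits_a).sum(axis=1, keepdims=True))
    log_pb = logits_b - np.log(np.exp(logits_b).sum(axis=1, keepdims=True))
    ce = -log_pa[np.arange(len(labels)), labels].mean()
    kl = (np.exp(log_pb) * (log_pb - log_pa)).sum(axis=1).mean()
    return ce + kl
```

Scoring by similarity (rather than by norm alone) is the point of the criterion: a channel with a large norm is still redundant if another channel in its bin produces a nearly identical feature map.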
Pages: 14560–14570 (10 pages)
Related papers (50 in total; [41]–[50] shown)
  • [41] Thinning of convolutional neural network with mixed pruning
    Yang, Wenzhu
    Jin, Lilei
    Wang, Sile
    Cui, Zhenchao
    Chen, Xiangyang
    Chen, Liping
    IET IMAGE PROCESSING, 2019, 13 (05) : 779 - 784
  • [42] Feature representation fidelity preservation during neural network pruning for enhanced compression efficiency
    Ambuj
    NEUROCOMPUTING, 2025, 634
  • [43] Convolutional neural network-based multimodal image fusion via similarity learning in the shearlet domain
    Haithem Hermessi
    Olfa Mourali
    Ezzeddine Zagrouba
    Neural Computing and Applications, 2018, 30 : 2029 - 2045
  • [45] Structured pruning of neural networks for constraints learning
    Cacciola, Matteo
    Frangioni, Antonio
    Lodi, Andrea
    OPERATIONS RESEARCH LETTERS, 2024, 57
  • [46] Feature Learning and Transfer Performance Prediction for Video Reinforcement Learning Tasks via a Siamese Convolutional Neural Network
    Song, Jinhua
    Gao, Yang
    Wang, Hao
    NEURAL INFORMATION PROCESSING (ICONIP 2018), PT I, 2018, 11301 : 350 - 361
  • [47] Quantisation and Pruning for Neural Network Compression and Regularisation
    Paupamah, Kimessha
    James, Steven
    Klein, Richard
    2020 INTERNATIONAL SAUPEC/ROBMECH/PRASA CONFERENCE, 2020, : 295 - 300
  • [48] Automated Pruning for Deep Neural Network Compression
    Manessi, Franco
    Rozza, Alessandro
    Bianco, Simone
    Napoletano, Paolo
    Schettini, Raimondo
    2018 24TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2018, : 657 - 664
  • [49] Neural Network Compression and Acceleration by Federated Pruning
    Pei, Songwen
    Wu, Yusheng
    Qiu, Meikang
    ALGORITHMS AND ARCHITECTURES FOR PARALLEL PROCESSING, ICA3PP 2020, PT II, 2020, 12453 : 173 - 183
  • [50] Automatic Compression Ratio Allocation for Pruning Convolutional Neural Networks
    Liu, Yunfeng
    Kong, Huihui
    Yu, Peihua
    ICVISP 2019: PROCEEDINGS OF THE 3RD INTERNATIONAL CONFERENCE ON VISION, IMAGE AND SIGNAL PROCESSING, 2019,