Structured pruning via feature channels similarity and mutual learning for convolutional neural network compression

Cited by: 0
Authors
Wei Yang
Yancai Xiao
Affiliations
[1] School of Mechanical, Electronic and Control Engineering, Beijing Jiaotong University
[2] Beijing Jiaotong University,Key Laboratory of Vehicle Advanced Manufacturing, Measuring and Control Technology, Ministry of Education
Source
Applied Intelligence | 2022, Vol. 52
Keywords
Convolutional neural network; Model compression; Feature channels similarity; Mutual learning
DOI
Not available
Abstract
The deployment of convolutional neural networks (CNNs) on resource-constrained devices has been hindered by their large memory footprint and computational cost. To obtain a lightweight network, we propose the feature channels similarity and mutual learning fine-tuning (FCS-MLFT) method. First, we focus on the similarity redundancy among the output feature channels of a CNN and propose a novel structured pruning criterion based on cosine similarity: we use K-Means to cluster the convolution kernels, according to the L1 norms of their corresponding feature maps, into several bins, and compute the similarity values between feature channels within each bin. Second, unlike the traditional approach of reusing the training procedure to recover the accuracy of the compressed model, we apply mutual learning fine-tuning (MLFT) to improve the accuracy of the compact model; the proposed method matches the accuracy of traditional fine-tuning (TFT) while requiring significantly fewer epochs. The experimental results show not only that the FCS criterion outperforms existing criteria, such as kernel norm-based and layer-wise feature norm-based methods, but also that the MLFT strategy reduces the number of epochs needed.
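The abstract outlines the FCS criterion only at a high level (K-Means bins over L1 norms, then cosine similarity within each bin); the paper's exact formulation is not reproduced here. The following is a minimal illustrative sketch under those assumptions, using a simple 1-D k-means over channel L1 norms and scoring each channel by its highest cosine similarity to a peer in the same bin; all function and parameter names are hypothetical.

```python
import numpy as np

def fcs_prune_scores(feature_maps, n_bins=3, seed=0):
    """Sketch of a feature-channel-similarity (FCS-style) pruning score.

    feature_maps: array of shape (C, H, W), one layer's output channels.
    Returns a per-channel redundancy score: the maximum cosine similarity
    to another channel in the same L1-norm bin (higher = more redundant,
    so high-scoring channels are candidates for pruning).
    """
    C = feature_maps.shape[0]
    flat = feature_maps.reshape(C, -1)

    # Step 1: bin channels by the L1 norm of their feature maps using a
    # plain 1-D k-means (stand-in for the paper's K-Means clustering).
    l1 = np.abs(flat).sum(axis=1)
    rng = np.random.default_rng(seed)
    centers = rng.choice(l1, size=n_bins, replace=False)
    for _ in range(20):
        labels = np.argmin(np.abs(l1[:, None] - centers[None, :]), axis=1)
        for k in range(n_bins):
            if np.any(labels == k):
                centers[k] = l1[labels == k].mean()

    # Step 2: within each bin, score a channel by its highest cosine
    # similarity to any other channel of the same bin.
    norms = np.linalg.norm(flat, axis=1) + 1e-12
    unit = flat / norms[:, None]
    scores = np.zeros(C)
    for k in range(n_bins):
        idx = np.where(labels == k)[0]
        if len(idx) < 2:
            continue  # singleton bin: no peer to be redundant with
        sim = unit[idx] @ unit[idx].T
        np.fill_diagonal(sim, -1.0)  # ignore self-similarity
        scores[idx] = sim.max(axis=1)
    return scores
```

A pruning pass would then drop the channels (and their kernels) with the highest scores, layer by layer, before fine-tuning. Restricting the similarity comparison to norm bins keeps it between channels of comparable magnitude, which is presumably the motivation for the clustering step.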
Pages: 14560-14570
Number of pages: 10
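The mutual learning fine-tuning (MLFT) step is described in the abstract only as an alternative to traditional fine-tuning; its exact objective is not given here. A common deep-mutual-learning formulation trains the pruned model alongside a peer, each minimizing cross-entropy plus a KL term toward the other's softmax. The sketch below illustrates that objective under this assumption; the function name and the `alpha` weight are hypothetical.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # stabilize before exponentiating
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def mutual_learning_losses(logits_a, logits_b, labels, alpha=0.5):
    """Sketch of a deep-mutual-learning objective for fine-tuning a pruned
    network (A) alongside a peer (B). Each network minimizes its own
    cross-entropy plus a KL divergence toward the peer's prediction."""
    pa, pb = softmax(logits_a), softmax(logits_b)
    n = len(labels)
    eps = 1e-12
    ce_a = -np.log(pa[np.arange(n), labels] + eps).mean()
    ce_b = -np.log(pb[np.arange(n), labels] + eps).mean()
    # KL(p_b || p_a) pulls A toward B, and vice versa.
    kl_ab = (pb * (np.log(pb + eps) - np.log(pa + eps))).sum(axis=1).mean()
    kl_ba = (pa * (np.log(pa + eps) - np.log(pb + eps))).sum(axis=1).mean()
    loss_a = ce_a + alpha * kl_ab
    loss_b = ce_b + alpha * kl_ba
    return loss_a, loss_b
```

Because each network receives a soft, continuously updated target from its peer rather than only hard labels, this style of fine-tuning can recover accuracy in fewer epochs, which is consistent with the abstract's claim for MLFT.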
Related papers
50 records in total
  • [21] Variational Convolutional Neural Network Pruning
    Zhao, Chenglong
    Ni, Bingbing
    Zhang, Jian
    Zhao, Qiwei
    Zhang, Wenjun
    Tian, Qi
    2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2019), 2019, : 2775 - 2784
  • [22] Convolutional Neural Network Pruning: A Survey
    Xu, Sheng
    Huang, Anran
    Chen, Lei
    Zhang, Baochang
    PROCEEDINGS OF THE 39TH CHINESE CONTROL CONFERENCE, 2020, : 7458 - 7463
  • [23] Accelerating Convolutional Neural Network Pruning via Spatial Aura Entropy
    Musat, Bogdan
    Andonie, Razvan
    2023 27TH INTERNATIONAL CONFERENCE INFORMATION VISUALISATION, IV, 2023, : 286 - 291
  • [24] Learning Filter Basis for Convolutional Neural Network Compression
    Li, Yawei
    Gu, Shuhang
    Van Gool, Luc
    Timofte, Radu
    2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2019), 2019, : 5622 - 5631
  • [25] Adversarial Structured Neural Network Pruning
    Cai, Xingyu
    Yi, Jinfeng
    Zhang, Fan
    Rajasekaran, Sanguthevar
    PROCEEDINGS OF THE 28TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT (CIKM '19), 2019, : 2433 - 2436
  • [26] Texture Similarity Evaluation via Siamese Convolutional Neural Network
    Hudec, Lukas
    Benesova, Wanda
    2018 25TH INTERNATIONAL CONFERENCE ON SYSTEMS, SIGNALS AND IMAGE PROCESSING (IWSSIP), 2018,
  • [27] Structured Pruning for Deep Convolutional Neural Networks: A Survey
    He, Yang
    Xiao, Lingao
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2024, 46 (05) : 2900 - 2919
  • [28] STRUCTURED PRUNING FOR GROUP REGULARIZED CONVOLUTIONAL NEURAL NETWORKS VIA DYNAMIC REGULARIZATION FACTOR
    Li, Feng
    Li, Bo
    Zhu, Meijiao
    Ma, Junchi
    Yuan, Jinlong
    JOURNAL OF INDUSTRIAL AND MANAGEMENT OPTIMIZATION, 2025, 21 (02) : 1440 - 1455
  • [29] Dirichlet Pruning for Neural Network Compression
    Adamczewski, Kamil
    Park, Mijung
    24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS), 2021, 130
  • [30] A Feature Map Lossless Compression Framework for Convolutional Neural Network Accelerators
    Zhang, Zekun
    Jiao, Xin
    Xu, Chengyu
    2024 IEEE 6TH INTERNATIONAL CONFERENCE ON AI CIRCUITS AND SYSTEMS, AICAS 2024, 2024, : 422 - 426