Using Feature Entropy to Guide Filter Pruning for Efficient Convolutional Networks

Cited by: 9
Authors
Li, Yun [1 ]
Wang, Luyang [1 ]
Peng, Sifan [1 ]
Kumar, Aakash [1 ]
Yin, Baoqun [1 ]
Affiliations
[1] Univ Sci & Technol China, Dept Automat, Hefei, Peoples R China
Keywords
Convolutional neural networks; Filter pruning; Entropy; Feature selection module;
DOI
10.1007/978-3-030-30484-3_22
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The rapid development of convolutional neural networks (CNNs) is usually accompanied by an increase in model size and computational cost. In this paper, we propose an entropy-based filter pruning (EFP) method to learn more efficient CNNs. Unlike many existing filter pruning approaches, our method prunes unimportant filters based on the amount of information carried by their corresponding feature maps. We employ entropy to measure the information contained in the feature maps and design a feature selection module to formulate pruning strategies. Pruning and fine-tuning are iterated several times, yielding thinner and more compact models with comparable accuracy. We empirically demonstrate the effectiveness of our method with several advanced CNNs on benchmark datasets. Notably, for VGG-16 on CIFAR-10, our EFP method prunes 92.9% of the parameters and reduces FLOPs (floating-point operations) by 76% without accuracy loss, advancing the state of the art.
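The core idea of the abstract, scoring each filter by the entropy of its output feature maps and pruning the lowest-scoring ones, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the histogram-based entropy estimate, the function names, and the fixed pruning ratio are all assumptions for the sketch.

```python
import numpy as np

def feature_map_entropy(feature_maps, num_bins=32):
    """Estimate the entropy of each filter's feature maps.

    feature_maps: array of shape (N, C, H, W) -- activations collected
    for N inputs across C filters (one feature map per filter).
    Returns C entropy scores; low entropy suggests a filter's output
    carries little information and is a candidate for pruning.
    """
    n, c, h, w = feature_maps.shape
    scores = np.zeros(c)
    for j in range(c):
        values = feature_maps[:, j].ravel()
        hist, _ = np.histogram(values, bins=num_bins)
        p = hist / hist.sum()
        p = p[p > 0]          # drop empty bins to avoid log(0)
        scores[j] = -np.sum(p * np.log2(p))
    return scores

def select_filters_to_prune(scores, prune_ratio=0.5):
    """Return indices of the lowest-entropy filters to remove."""
    k = int(len(scores) * prune_ratio)
    return np.argsort(scores)[:k]
```

In an iterative prune-and-fine-tune loop, as described in the abstract, these scores would be recomputed on a held-out batch after each fine-tuning round before the next pruning step.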
Pages: 263 - 274
Number of pages: 12
Related Papers
50 records
  • [1] Pruning feature maps for efficient convolutional neural networks
    Guo, Xiao-ting
    Xie, Xin-shu
    Lang, Xun
    OPTIK, 2023, 281
  • [2] Auto-Balanced Filter Pruning for Efficient Convolutional Neural Networks
    Ding, Xiaohan
    Ding, Guiguang
    Han, Jungong
    Tang, Sheng
    THIRTY-SECOND AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTIETH INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / EIGHTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2018, : 6797 - 6804
  • [3] Filter Pruning for Efficient Transfer Learning in Deep Convolutional Neural Networks
    Reinhold, Caique
    Roisenberg, Mauro
    ARTIFICIAL INTELLIGENCE AND SOFT COMPUTING, PT I, 2019, 11508 : 191 - 202
  • [4] Filter Level Pruning Based on Similar Feature Extraction for Convolutional Neural Networks
    Li, Lianqiang
    Xu, Yuhui
    Zhu, Jie
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2018, E101D (04) : 1203 - 1206
  • [5] Filter pruning by quantifying feature similarity and entropy of feature maps
    Liu, Yajun
    Fan, Kefeng
    Wu, Dakui
    Zhou, Wenju
    NEUROCOMPUTING, 2023, 544
  • [6] Filter pruning with a feature map entropy importance criterion for convolution neural networks compressing
    Wang, Jielei
    Jiang, Ting
    Cui, Zongyong
    Cao, Zongjie
    NEUROCOMPUTING, 2021, 461 : 41 - 54
  • [7] Acceleration of Deep Convolutional Neural Networks Using Adaptive Filter Pruning
    Singh, Pravendra
    Verma, Vinay Kumar
    Rai, Piyush
    Namboodiri, Vinay P.
    IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2020, 14 (04) : 838 - 847
  • [8] Feature Statistics Guided Efficient Filter Pruning
    Li, Hang
    Ma, Chen
    Xu, Wei
    Liu, Xue
    PROCEEDINGS OF THE TWENTY-NINTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, : 2619 - 2625
  • [9] Entropy Induced Pruning Framework for Convolutional Neural Networks
    Lu, Yiheng
    Guan, Ziyu
    Yang, Yaming
    Zhao, Wei
    Gong, Maoguo
    Xu, Cai
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 4, 2024, : 3918 - 3926
  • [10] FPC: Filter pruning via the contribution of output feature map for deep convolutional neural networks acceleration
    Chen, Yanming
    Wen, Xiang
    Zhang, Yiwen
    He, Qiang
    KNOWLEDGE-BASED SYSTEMS, 2022, 238