Filter Pruning by Switching to Neighboring CNNs With Good Attributes

Cited by: 43
|
Authors
He, Yang [1 ,2 ]
Liu, Ping [1 ,2 ]
Zhu, Linchao [1 ]
Yang, Yi [3 ]
Affiliations
[1] Univ Technol Sydney, Australian Artificial Intelligence Inst, ReLER Lab, Sydney, NSW 2007, Australia
[2] ASTAR, Ctr Frontier AI Res CFAR, Singapore 138632, Singapore
[3] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou 310000, Peoples R China
Funding
Australian Research Council;
Keywords
Neural networks; Training; Training data; Neurons; Libraries; Graphics processing units; Electronic mail; Filter pruning; meta-attributes; network compression; neural networks; CLASSIFICATION; ACCURACY;
DOI
10.1109/TNNLS.2022.3149332
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Filter pruning is an effective way to reduce the computational cost of neural networks. Existing methods show that allowing previously pruned filters to be updated enlarges the model capacity and achieves better performance. However, during the iterative pruning process, the pruning criterion stays the same even as the network weights are updated to new values. In addition, when evaluating filter importance, these methods consider only the magnitude information of the filters. In a neural network, however, filters do not work individually; they affect one another. Consequently, the magnitude of a filter, which reflects only that individual filter itself, is not enough to judge its importance. To solve these problems, we propose meta-attribute-based filter pruning (MFP). First, to extend the existing magnitude-based pruning criteria, we introduce a new set of criteria that consider the geometric distance between filters. In addition, to explicitly assess the current state of the network, we adaptively select the most suitable pruning criterion via a meta-attribute, a property of the neural network at its current state. Experiments on two image classification benchmarks validate our method. For ResNet-50 on ILSVRC-2012, we reduce FLOPs by more than 50% with only a 0.44% top-5 accuracy loss.
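The two families of criteria the abstract contrasts can be illustrated with a minimal sketch. This is not the paper's MFP implementation (the meta-attribute-based criterion selection is omitted); it is a hypothetical NumPy example, assuming an L2-norm magnitude score and a sum-of-pairwise-distances score as a stand-in for the geometric-distance idea. The function names (`magnitude_scores`, `geometric_scores`, `prune`) are illustrative, not from the paper.

```python
import numpy as np

def magnitude_scores(filters):
    # Magnitude criterion: L2 norm of each filter.
    # Reflects only the information of each individual filter itself.
    return np.linalg.norm(filters.reshape(len(filters), -1), axis=1)

def geometric_scores(filters):
    # Geometric-distance criterion: sum of distances from each filter
    # to all other filters. A filter close to the others is redundant,
    # because its role can be covered by its neighbors.
    flat = filters.reshape(len(filters), -1)
    dists = np.linalg.norm(flat[:, None, :] - flat[None, :, :], axis=2)
    return dists.sum(axis=1)

def prune(filters, scores, ratio):
    # Remove the fraction `ratio` of filters with the lowest scores.
    n_keep = len(filters) - int(len(filters) * ratio)
    keep = np.sort(np.argsort(scores)[-n_keep:])
    return filters[keep]
```

Note how the two criteria can disagree: a filter that duplicates another may have a large norm (kept by the magnitude criterion) but a small total distance to its neighbors (pruned by the geometric criterion), which is why a single fixed criterion can be insufficient.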
Pages: 8044 - 8056
Page count: 13
Related Papers
50 records in total
  • [1] A Study of Filter Duplication for CNNs Filter Pruning
    Ikuta, Ryosuke
    Yata, Noriko
    Manabe, Yoshitsugu
    INTERNATIONAL WORKSHOP ON ADVANCED IMAGING TECHNOLOGY, IWAIT 2024, 2024, 13164
  • [2] FPWT: Filter pruning via wavelet transform for CNNs
    Liu, Yajun
    Fan, Kefeng
    Zhou, Wenju
    NEURAL NETWORKS, 2024, 179
  • [3] Stability Based Filter Pruning for Accelerating Deep CNNs
    Singh, Pravendra
    Kadi, Vinay Sameer Raja
    Verma, Nikhil
    Namboodiri, Vinay P.
    2019 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2019, : 1166 - 1174
  • [4] COMPRESSING AUDIO CNNS WITH GRAPH CENTRALITY BASED FILTER PRUNING
    King, James A.
    Singh, Arshdeep
    Plumbley, Mark D.
    2023 IEEE WORKSHOP ON APPLICATIONS OF SIGNAL PROCESSING TO AUDIO AND ACOUSTICS, WASPAA, 2023,
  • [5] Interspace Pruning: Using Adaptive Filter Representations to Improve Training of Sparse CNNs
    Wimmer, Paul
    Mehnert, Jens
    Condurache, Alexandru
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2022, : 12517 - 12527
  • [6] Robust pruning for efficient CNNs
    Ide, Hidenori
    Kobayashi, Takumi
    Watanabe, Kenji
    Kurita, Takio
    PATTERN RECOGNITION LETTERS, 2020, 135 : 90 - 98
  • [7] Compressing CNNs Using Multilevel Filter Pruning for the Edge Nodes of Multimedia Internet of Things
    Liu, Xingang
    Wu, Lishuai
    Dai, Cheng
    Chao, Han-Chieh
    IEEE INTERNET OF THINGS JOURNAL, 2021, 8 (14) : 11041 - 11051
  • [8] Dynamic Structure Pruning for Compressing CNNs
    Park, Jun-Hyung
    Kim, Yeachan
    Kim, Junho
    Choi, Joon-Young
    Lee, SangKeun
    THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 8, 2023, : 9408 - 9416
  • [9] FALF ConvNets: Fatuous auxiliary loss based filter-pruning for efficient deep CNNs
    Singh, Pravendra
    Kadi, Vinay Sameer Raja
    Namboodiri, Vinay P.
    IMAGE AND VISION COMPUTING, 2020, 93
  • [10] Pruning Filter in Filter
    Meng, Fanxu
    Cheng, Hao
    Li, Ke
    Luo, Huixiang
    Guo, Xiaowei
    Lu, Guangming
    Sun, Xing
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33