Filter Pruning by Switching to Neighboring CNNs With Good Attributes

Cited by: 43
Authors
He, Yang [1 ,2 ]
Liu, Ping [1 ,2 ]
Zhu, Linchao [1 ]
Yang, Yi [3 ]
Affiliations
[1] Univ Technol Sydney, Australian Artificial Intelligence Inst, ReLER Lab, Sydney, NSW 2007, Australia
[2] ASTAR, Ctr Frontier AI Res CFAR, Singapore 138632, Singapore
[3] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou 310000, Peoples R China
Funding
Australian Research Council;
Keywords
Neural networks; Training; Training data; Neurons; Libraries; Graphics processing units; Electronic mail; Filter pruning; meta-attributes; network compression; neural networks; CLASSIFICATION; ACCURACY;
DOI
10.1109/TNNLS.2022.3149332
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Filter pruning is an effective way to reduce the computational cost of neural networks. Existing methods show that updating previously pruned filters enables larger model capacity and better performance. However, during the iterative pruning process, the pruning criterion remains fixed even as the network weights are updated to new values. In addition, when evaluating filter importance, only the magnitude information of the filters is considered. Yet filters in a neural network do not work in isolation; they interact with one another. As a result, the magnitude of each filter, which reflects only that filter by itself, is not sufficient to judge its importance. To address these problems, we propose meta-attribute-based filter pruning (MFP). First, to extend existing magnitude-based pruning criteria, we introduce a new set of criteria that consider the geometric distance between filters. Second, to explicitly assess the current state of the network, we adaptively select the most suitable pruning criterion via a meta-attribute, a property of the neural network at its current state. Experiments on two image classification benchmarks validate our method. For ResNet-50 on ILSVRC-2012, we reduce more than 50% of FLOPs with only a 0.44% top-5 accuracy loss.
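To make the geometric-distance idea concrete, the sketch below ranks filters by their total Euclidean distance to all other filters in the same layer: filters that lie closest to the rest are the most replaceable and are pruned first. This is an illustrative NumPy sketch of a distance-based criterion, not the authors' exact MFP implementation; the function name and toy data are made up for the example.

```python
import numpy as np

def prune_by_geometric_distance(filters, n_prune):
    """Return indices of the `n_prune` most redundant filters.

    filters: array of shape (num_filters, k*k*c) -- each row is one
    convolutional filter flattened into a vector.
    A filter with a small total distance to the other filters sits in a
    crowded region of weight space, so removing it loses little.
    """
    # Pairwise Euclidean distances between all filters: (n, n) matrix.
    diff = filters[:, None, :] - filters[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    # Total distance from each filter to every other filter.
    score = dist.sum(axis=1)
    # Smallest total distance = most redundant = pruned first.
    return np.argsort(score)[:n_prune]

# Toy layer with 4 flattened "filters"; the first, second, and fourth
# rows are near-duplicates, the third is an outlier.
w = np.array([[1.0, 0.0],
              [1.1, 0.0],
              [5.0, 5.0],
              [0.9, 0.1]])
print(prune_by_geometric_distance(w, 2))  # prints [0 1]
```

In the toy example the two filters closest to the cluster center are selected for pruning, while the outlier filter, which carries distinct information, is kept; a magnitude-only criterion would instead rank these filters purely by their norms.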
Pages: 8044-8056
Page count: 13
Related papers
50 items
  • [31] Attributes of a good judge
    Kyrou, Justice Emilios
    JOURNAL OF JUDICIAL ADMINISTRATION, 2013, 23 (02):
  • [32] Are a Few Neighboring Peers Good Enough?
    Zhong, Lili
    Dai, Jie
    Li, Bo
    Li, Baochun
    Jin, Hai
    2010 IEEE GLOBAL TELECOMMUNICATIONS CONFERENCE GLOBECOM 2010, 2010,
  • [33] Dissecting traffic fingerprinting CNNs with filter activations
    Dahanayaka, Thilini
    Jourjon, Guillaume
    Seneviratne, Suranga
    COMPUTER NETWORKS, 2022, 206
  • [34] Discrete cosine transform for filter pruning
    Chen, Yaosen
    Zhou, Renshuang
    Guo, Bing
    Shen, Yan
    Wang, Wei
    Wen, Xuming
    Suo, Xinhua
    APPLIED INTELLIGENCE, 2023, 53 (03) : 3398 - 3414
  • [35] Soft independence guided filter pruning
    Yang, Liu
    Gu, Shiqiao
    Shen, Chenyang
    Zhao, Xile
    Hu, Qinghua
    PATTERN RECOGNITION, 2024, 153
  • [36] RANP: Resource Aware Neuron Pruning at Initialization for 3D CNNs
    Xu, Zhiwei
    Ajanthan, Thalaiyasingam
    Vineet, Vibhav
    Hartley, Richard
    2020 INTERNATIONAL CONFERENCE ON 3D VISION (3DV 2020), 2020, : 180 - 189
  • [37] An Efficient Channel-level Pruning for CNNs without Fine-tuning
    Xu, Zhongtian
    Sun, Jingwei
    Liu, Yunjie
    Sun, Guangzhong
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [39] FILTER PRUNING VIA SOFTMAX ATTENTION
    Cho, Sungmin
    King, Hyeseong
    Kwon, Junseok
    2021 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2021, : 3507 - 3511
  • [40] Weighting and Pruning of Decision Rules by Attributes and Attribute Rankings
    Stanczyk, Urszula
    COMPUTER AND INFORMATION SCIENCES, ISCIS 2016, 2016, 659 : 106 - 114