Pruning Deep Neural Networks with l0-constrained Optimization

Cited by: 1
Authors
Phan, Dzung T. [1 ]
Nguyen, Lam M. [1 ]
Nguyen, Nam H. [1 ]
Kalagnanam, Jayant R. [1 ]
Affiliations
[1] IBM Res, Thomas J Watson Res Ctr, Yorktown Hts, NY 10598 USA
DOI
10.1109/ICDM50108.2020.00152
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Deep neural networks (DNNs) achieve state-of-the-art accuracy on many tasks, but they can require large amounts of memory, high energy consumption, and long inference times. Modern DNNs can have hundreds of millions of parameters, making them difficult to deploy in low-resource environments. Pruning redundant connections without sacrificing accuracy is one of the most popular approaches to overcoming these limitations. We propose two l(0)-constrained optimization models for pruning deep neural networks layer by layer. The first model handles a general activation function, while the second is tailored to the ReLU. We introduce an efficient cutting-plane algorithm that solves the latter to optimality. Our experiments show that the proposed approach achieves competitive compression rates compared with several state-of-the-art baseline methods.
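The abstract does not spell out the paper's cutting-plane algorithm, but the core object in any l(0)-constrained pruning formulation is the constraint ||w||_0 <= k, i.e. at most k nonzero weights per layer. A minimal sketch of the standard projection onto that constraint set (hard thresholding, keeping the k largest-magnitude weights) is shown below; this is a generic building block, not the authors' method, and the function name `project_l0` is an illustrative assumption.

```python
import numpy as np

def project_l0(w, k):
    """Project a weight vector onto the l0 ball {w : ||w||_0 <= k}.

    This is the exact Euclidean projection: keep the k entries of
    largest magnitude and zero out the rest (hard thresholding).
    Note: this is a generic sparsification step, not the paper's
    cutting-plane algorithm.
    """
    w = np.asarray(w, dtype=float)
    if k >= w.size:
        return w.copy()
    pruned = np.zeros_like(w)
    # Indices of the k largest-magnitude entries (unordered).
    idx = np.argpartition(np.abs(w), -k)[-k:]
    pruned[idx] = w[idx]
    return pruned

# Toy layer: keep the 2 largest-magnitude weights; only -1.2 and 0.9 survive.
w = np.array([0.05, -1.2, 0.3, 0.9, -0.01])
print(project_l0(w, 2))
```

Layer-by-layer pruning schemes typically alternate a step like this with a data-dependent reconstruction objective, so that the surviving weights are re-fit to match the layer's original (pre-activation or post-ReLU) outputs rather than simply truncated.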
Pages: 1214-1219
Page count: 6