Accelerating Convolutional Neural Networks with Dominant Convolutional Kernel and Knowledge Pre-regression

Cited by: 13
Authors:
Wang, Zhenyang [1 ]
Deng, Zhidong [1 ]
Wang, Shiyao [1 ]
Affiliations:
[1] Tsinghua Univ, Dept Comp Sci, Tsinghua Natl Lab Informat Sci & Technol, State Key Lab Intelligent Technol & Syst, Beijing 100084, Peoples R China
Keywords:
Dominant convolutional kernel; Knowledge pre-regression; Model compression; Knowledge distilling;
DOI
10.1007/978-3-319-46484-8_32
Chinese Library Classification:
TP18 [Artificial Intelligence Theory]
Discipline codes:
081104; 0812; 0835; 1405
Abstract
Aiming at accelerating the test time of deep convolutional neural networks (CNNs), we propose a model compression method that combines a novel dominant kernel (DK) with a new training method called knowledge pre-regression (KP). In the resulting DK²PNet model, DK performs a low-rank decomposition of convolutional kernels, while KP transfers knowledge of intermediate hidden layers from a larger teacher network to its compressed student network using a cross-entropy loss function instead of the Euclidean distance used in previous work. Experimental results on the CIFAR-10, CIFAR-100, MNIST, and SVHN benchmarks show that DK²PNet achieves accuracy close to the state of the art while requiring dramatically fewer model parameters.
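The abstract pairs two ideas: a low-rank factorization of convolutional kernels and an intermediate-layer distillation loss based on cross entropy rather than Euclidean distance. The sketch below is a minimal, hypothetical PyTorch illustration of those two ideas only; the class and function names, the rank and temperature choices, and the assumption that teacher and student feature maps share the same channel count (otherwise a 1×1 adapter would be needed) are illustrative assumptions, not the paper's exact DK²PNet formulation.

```python
# Hypothetical sketch of (a) a low-rank convolutional layer and
# (b) a cross-entropy-based intermediate-feature matching loss.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LowRankConv(nn.Module):
    """Approximates a dense k x k convolution (c_in -> c_out channels) by a
    1x1 projection onto `rank` dominant components followed by a k x k
    convolution, reducing parameters from c_in*c_out*k*k to roughly
    c_in*rank + rank*c_out*k*k."""
    def __init__(self, c_in, c_out, k, rank):
        super().__init__()
        self.project = nn.Conv2d(c_in, rank, kernel_size=1, bias=False)
        self.conv = nn.Conv2d(rank, c_out, kernel_size=k, padding=k // 2, bias=False)

    def forward(self, x):
        return self.conv(self.project(x))

def pre_regression_loss(student_feat, teacher_feat, temperature=1.0):
    """Matches intermediate feature maps with a cross-entropy-style loss
    (softmax over channels at each spatial location) instead of an
    elementwise Euclidean (MSE) loss."""
    s = F.log_softmax(student_feat.flatten(2) / temperature, dim=1)
    t = F.softmax(teacher_feat.flatten(2) / temperature, dim=1)
    return -(t * s).sum(dim=1).mean()
```

With c_in = c_out = 64, k = 3, and rank = 16, the factorized layer holds 64·16 + 16·64·3·3 = 10,240 weights versus 36,864 for a dense 3×3 convolution.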
Pages: 533-548
Page count: 16
Related Papers (50 total)
  • [11] FastWave: Accelerating Autoregressive Convolutional Neural Networks on FPGA
    Hussain, Shehzeen
    Javaheripi, Mojan
    Neekhara, Paarth
    Kastner, Ryan
    Koushanfar, Farinaz
    2019 IEEE/ACM INTERNATIONAL CONFERENCE ON COMPUTER-AIDED DESIGN (ICCAD), 2019,
  • [12] A Method for Accelerating Convolutional Neural Networks Based on FPGA
    Zhao, Mengxing
    Li, Xiang
    Zhu, Shunyi
    Zhou, Li
    2019 4TH INTERNATIONAL CONFERENCE ON COMMUNICATION AND INFORMATION SYSTEMS (ICCIS 2019), 2019, : 241 - 246
  • [13] Accelerating Convolutional Neural Networks with Dynamic Channel Pruning
    Zhang, Chiliang
    Hu, Tao
    Guan, Yingda
    Ye, Zuochang
    2019 DATA COMPRESSION CONFERENCE (DCC), 2019, : 563 - 563
  • [14] Accelerating Backward Convolution of Convolutional Neural Networks on BWDSP
    Yang, Jiangping
    Wang, Gai
    Lu, Maohui
    Zheng, Qilong
    2018 9TH INTERNATIONAL CONFERENCE ON PARALLEL ARCHITECTURES, ALGORITHMS AND PROGRAMMING (PAAP 2018), 2018, : 163 - 170
  • [15] Accelerating cardiovascular model building with convolutional neural networks
    Gabriel Maher
    Nathan Wilson
    Alison Marsden
    Medical & Biological Engineering & Computing, 2019, 57 : 2319 - 2335
  • [16] Convolutional Neural Networks for Financial Text Regression
    Dereli, Nesat
    Saraclar, Murat
57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019): STUDENT RESEARCH WORKSHOP, 2019, : 331 - 337
  • [17] Convolutional Kernel Networks
    Mairal, Julien
Koniusz, Piotr
    Harchaoui, Zaid
    Schmid, Cordelia
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 27 (NIPS 2014), 2014, 27
  • [18] Versatile kernel reactivation for deep convolutional neural networks
    Lee, Jeong Jun
    Kim, Hyun
    ELECTRONICS LETTERS, 2022, 58 (19) : 723 - 725
  • [19] SpreadOut: A Kernel Weight Initializer for Convolutional Neural Networks
    Hertzog, Matheus I.
    Correa, Ulisses Brisolara
    Araujo, Ricardo M.
    2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2019,
  • [20] Kernel Support Vector Machines and Convolutional Neural Networks
    Jiang, Shihao
    Hartley, Richard
    Fernando, Basura
    2018 INTERNATIONAL CONFERENCE ON DIGITAL IMAGE COMPUTING: TECHNIQUES AND APPLICATIONS (DICTA), 2018, : 560 - 566