Convolutional Neural Network Compression via Tensor-Train Decomposition on Permuted Weight Tensor with Automatic Rank Determination

Cited by: 2
Authors
Gabor, Mateusz [1]
Zdunek, Rafal [1]
Affiliations
[1] Wroclaw Univ Sci & Technol, Fac Elect Photon & Microsyst, Wybrzeze Wyspianskiego 27, PL-50370 Wroclaw, Poland
Keywords
Neural network compression; Convolutional neural network; Tensor decomposition; Tensor-train decomposition
DOI
10.1007/978-3-031-08757-8_54
Chinese Library Classification
TP39 [Computer applications]
Subject classification codes
081203; 0835
Abstract
Convolutional neural networks (CNNs) are among the most commonly investigated models in computer vision. Deep CNNs yield high performance, but their large size is a common issue. Addressing this problem requires compression methods that substantially reduce the size of the network while keeping its accuracy at a similar level. This study contributes to the field of CNN compression by introducing a novel low-rank compression method based on tensor-train decomposition of a permuted kernel weight tensor with automatic rank determination. The proposed method is easy to implement and allows neural networks to be fine-tuned from the decomposed factors instead of being trained from scratch. Experiments on various CNN architectures and two datasets demonstrate that the proposed method outperforms other CNN compression methods with respect to parameter and FLOPS compression, at a small drop in classification accuracy.
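
The decomposition step described in the abstract can be sketched in a few lines of NumPy. The code below is a minimal illustration, not the authors' implementation: it permutes a 4-D convolution kernel and applies the standard TT-SVD algorithm, choosing the TT ranks automatically from a relative-error threshold. The permutation order, the rel_err criterion, and the random example kernel are all illustrative assumptions; the paper's exact permutation scheme and rank-selection rule may differ.

# Minimal sketch of TT-SVD with threshold-based automatic rank selection.
# Assumptions (not from the paper): permutation (kh, kw, in, out) and the
# relative-error criterion used to pick the ranks.
import numpy as np

def tt_svd(tensor, rel_err=0.1):
    """Decompose `tensor` into TT cores; each rank is the smallest value
    keeping the truncation error of that step below a per-step threshold."""
    dims, d = tensor.shape, tensor.ndim
    delta = rel_err * np.linalg.norm(tensor) / np.sqrt(d - 1)
    cores, r_prev, mat = [], 1, tensor
    for k in range(d - 1):
        mat = mat.reshape(r_prev * dims[k], -1)
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        tails = np.append(np.cumsum(S[::-1] ** 2)[::-1], 0.0)  # tails[r] = sum of S[j]^2 for j >= r
        r = max(1, int(np.argmax(np.sqrt(tails) <= delta)))    # smallest rank within the threshold
        cores.append(U[:, :r].reshape(r_prev, dims[k], r))
        mat = S[:r, None] * Vt[:r]                              # residual carried to the next step
        r_prev = r
    cores.append(mat.reshape(r_prev, dims[-1], 1))
    return cores

# Example: a conv layer with 128 output channels, 64 input channels, 3x3 kernels.
W = np.random.randn(128, 64, 3, 3).astype(np.float32)
cores = tt_svd(np.transpose(W, (2, 3, 1, 0)), rel_err=0.1)      # permute, then decompose
print([c.shape for c in cores])                                 # TT core shapes

A random kernel, as in the example, has little low-rank structure and therefore compresses poorly; trained convolution kernels typically admit much smaller TT ranks, which is what yields the parameter and FLOPS savings reported in the paper.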
Pages: 654-667
Page count: 14