A tensor compression algorithm using Tucker decomposition and dictionary dimensionality reduction

Cited by: 2
Authors
Gan, Chenquan [1,2,3]
Mao, Junwei [1,2,3]
Zhang, Zufan [1,2,3]
Zhu, Qingyi [4]
Affiliations
[1] Chongqing Univ Posts & Telecommun, Sch Commun & Informat Engn, Chongqing 400065, Peoples R China
[2] Chongqing Key Lab Mobile Commun Technol, Chongqing, Peoples R China
[3] Minist Educ, Engn Res Ctr Mobile Commun, Chongqing, Peoples R China
[4] Chongqing Univ Posts & Telecommun, Sch Cyber Secur & Informat Law, Chongqing, Peoples R China
Keywords
Tensor signal compression; Tucker decomposition; sparse representation; dictionary learning; denoising ability;
DOI
10.1177/1550147720916408
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Tensor compression algorithms play an important role in the processing of multidimensional signals. In previous work, the tensor data structure is usually destroyed by vectorization operations, resulting in information loss and additional noise. To this end, this article proposes a tensor compression algorithm using Tucker decomposition and dictionary dimensionality reduction, which mainly consists of three parts: tensor dictionary representation, dictionary preprocessing, and dictionary update. Specifically, sparse representation and Tucker decomposition are applied to the tensor, from which the dictionary, sparse coefficient, and core tensor are obtained. The sparse representation can then be derived from the relationship between the sparse coefficient and the core tensor. In addition, the dimensionality of the input tensor is reduced by using concentrated dictionary learning. Finally, experiments show that, compared with other algorithms, the proposed algorithm has clear advantages in preserving the original data information and in denoising ability.
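For a concrete picture of the general Tucker-plus-dictionary idea sketched in the abstract, the following is a minimal illustration, not the authors' exact algorithm: the Tucker factors and core tensor are obtained via a truncated HOSVD in plain NumPy, and the core is then sparsely coded against a randomly initialized dictionary with orthogonal matching pursuit. The names hosvd, omp_sparse_code, the dictionary D, and all sizes are illustrative assumptions; the paper's dictionary preprocessing, dictionary update, and concentrated dictionary learning steps are not reproduced here.

```python
# Minimal sketch of a Tucker-plus-sparse-coding pipeline (illustrative only,
# not the authors' published algorithm). Plain NumPy: Tucker factors come from
# a truncated HOSVD, and the core tensor is coded against a hypothetical
# dictionary D via orthogonal matching pursuit (OMP).
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move `mode` to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T, ranks):
    """Truncated HOSVD: a factor matrix per mode, then the core tensor."""
    factors = []
    for mode, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        factors.append(U[:, :r])                  # leading left singular vectors
    core = T
    for mode, U in enumerate(factors):            # core = T x_1 U1^T x_2 U2^T ...
        core = np.moveaxis(np.tensordot(U.T, core, axes=(1, mode)), 0, mode)
    return core, factors

def omp_sparse_code(x, D, k):
    """Orthogonal matching pursuit: represent x with at most k atoms of D."""
    residual, support = x.copy(), []
    coeff = np.zeros(D.shape[1])
    for _ in range(k):
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        sub = D[:, support]
        sol, *_ = np.linalg.lstsq(sub, x, rcond=None)
        residual = x - sub @ sol
    coeff[support] = sol
    return coeff

# Example: compress a random 3-way tensor to a small core, then code it sparsely.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 20, 20))
core, factors = hosvd(X, ranks=(5, 5, 5))

D = rng.standard_normal((core.size, 50))          # hypothetical dictionary
D /= np.linalg.norm(D, axis=0)                    # unit-norm atoms
alpha = omp_sparse_code(core.ravel(), D, k=10)    # sparse coefficient vector
print("core shape:", core.shape, "nonzeros in alpha:", np.count_nonzero(alpha))
```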
Pages: 10