UNIQUENESS OF TENSOR TRAIN DECOMPOSITION WITH LINEAR DEPENDENCIES

Cited by: 0
Authors
Zniyed, Yassine [1 ]
Miron, Sebastian [2 ]
Boyer, Remy [3 ]
Brie, David [2 ]
Affiliations
[1] CentraleSupelec, Lab Signaux & Syst (L2S), Gif-sur-Yvette, France
[2] Univ Lorraine, CNRS, CRAN, Vandoeuvre-les-Nancy, France
[3] Univ Lille, Lab CRIStAL, Villeneuve d'Ascq, France
Keywords
Tensor Train; PARALIND; identifiability; 3-way arrays
DOI
10.1109/camsap45676.2019.9022651
Chinese Library Classification
TP3 [computing technology, computer technology]
Discipline code
0812
Abstract
With advances in measurement/sensing technologies, the data collected in a large number of applications are intrinsically multidimensional. This can be interpreted as a growth of the dimensionality/order of the associated tensor. There is therefore a crucial need to derive equivalent and alternative models of a high-order tensor as a graph of low-order tensors. In this work we consider a "train" graph, i.e., a Q-order tensor is represented as a Tensor Train (TT) composed of Q - 2 third-order core tensors and two core matrices. In this context, it has been shown that a canonical rank-R CPD model can always be represented exactly by a TT model whose cores themselves admit canonical rank-R CPDs. This model is called TT-CPD. We generalize this equivalence to the PARALIND model in order to take into account potential linear dependencies in the factors. We derive and discuss uniqueness conditions for the resulting TT-PARALIND model.
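The TT structure described above (two core matrices plus Q - 2 third-order cores) can be illustrated with the standard TT-SVD algorithm, which is not specific to this paper but produces exactly this kind of train of cores. The following is a minimal numpy sketch, assuming a dense input tensor; function names and the tolerance `eps` are illustrative, not from the paper.

```python
import numpy as np

def tt_svd(tensor, eps=1e-10):
    """Decompose a Q-order tensor into a Tensor Train via sequential SVDs.

    Returns Q cores of shape (r_{q-1}, n_q, r_q); the first and last cores
    have boundary ranks r_0 = r_Q = 1, so they act as the two core matrices,
    and the Q - 2 middle cores are genuinely third-order.
    """
    dims = tensor.shape
    Q = len(dims)
    cores = []
    r = 1
    C = tensor.reshape(dims[0], -1)
    for q in range(Q - 1):
        # Unfold: rows index (previous rank, current mode), columns the rest.
        C = C.reshape(r * dims[q], -1)
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        # Numerical TT-rank at this split (drop tiny singular values).
        rank = max(1, int(np.sum(s > eps * s[0])))
        cores.append(U[:, :rank].reshape(r, dims[q], rank))
        C = s[:rank, None] * Vt[:rank]  # carry the remainder forward
        r = rank
    cores.append(C.reshape(r, dims[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the train of cores back into the full tensor."""
    res = cores[0]
    for core in cores[1:]:
        res = np.tensordot(res, core, axes=(-1, 0))
    return res.reshape([c.shape[1] for c in cores])

# Example: a 4-order tensor yields 4 cores (2 boundary matrices + 2 third-order cores).
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4, 5, 6))
cores = tt_svd(X)
X_hat = tt_reconstruct(cores)
```

Without truncation the reconstruction is exact; the TT-ranks `r_q` recovered here are the quantities whose structure the TT-CPD/TT-PARALIND equivalence constrains.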
Pages: 460-464
Page count: 5