UNIQUENESS OF TENSOR TRAIN DECOMPOSITION WITH LINEAR DEPENDENCIES

Cited by: 0
Authors
Zniyed, Yassine [1 ]
Miron, Sebastian [2 ]
Boyer, Remy [3 ]
Brie, David [2 ]
Affiliations
[1] CentraleSupelec, Lab Signaux & Syst, Gif-sur-Yvette, France
[2] Univ Lorraine, CNRS, CRAN, Vandoeuvre-les-Nancy, France
[3] Univ Lille, Lab CRIStAL, Villeneuve-d'Ascq, France
Keywords
Tensor Train; PARALIND; identifiability; three-way arrays
DOI
10.1109/camsap45676.2019.9022651
CLC classification
TP3 [Computing Technology; Computer Technology]
Subject classification code
0812
Abstract
With advances in measurement and sensing technologies, the collected data are intrinsically multidimensional in a large number of applications. This can be interpreted as a growth of the dimensionality/order of the associated tensor. There is therefore a crucial need for equivalent and alternative models that represent a high-order tensor as a graph of low-order tensors. In this work we consider a "train" graph, i.e., a Q-order tensor is represented as a Tensor Train (TT) composed of Q - 2 third-order core tensors and two core matrices. In this context, it has been shown that a canonical rank-R CPD model can always be represented exactly by a TT model whose cores themselves admit canonical rank-R CPD structure. This model is called TT-CPD. We generalize this equivalence to the PARALIND model in order to account for potential linear dependencies in the factors. We derive and discuss uniqueness conditions for the resulting TT-PARALIND model.
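The CPD-to-TT equivalence underlying the abstract can be illustrated with the generic TT-SVD procedure (sequential truncated SVDs of unfoldings). This is a minimal numerical sketch, not the paper's TT-PARALIND construction; the function names `tt_svd` and `tt_to_full` are illustrative, and the example only shows that a rank-2 CPD tensor admits an exact TT representation with TT-ranks bounded by 2.

```python
import numpy as np

def tt_svd(tensor, eps=1e-10):
    """Decompose a Q-order tensor into TT cores via sequential truncated SVDs.

    Returns a list of Q cores of shape (r_{q-1}, dim_q, r_q), with boundary
    ranks r_0 = r_Q = 1. Singular values below eps * s_max are truncated.
    """
    dims = tensor.shape
    Q = len(dims)
    cores = []
    r_prev = 1
    mat = tensor.reshape(dims[0], -1)  # first unfolding
    for q in range(Q - 1):
        U, s, Vt = np.linalg.svd(mat, full_matrices=False)
        r = max(1, int((s > eps * s[0]).sum()))  # numerical rank
        cores.append(U[:, :r].reshape(r_prev, dims[q], r))
        # carry the remainder forward and refold for the next unfolding
        mat = (s[:r, None] * Vt[:r]).reshape(r * dims[q + 1], -1)
        r_prev = r
    cores.append(mat.reshape(r_prev, dims[Q - 1], 1))
    return cores

def tt_to_full(cores):
    """Contract TT cores back into the full tensor."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.reshape([c.shape[1] for c in cores])

# Example: a 3-order tensor built from a rank-2 CPD; its TT-ranks are <= 2.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 2))
B = rng.standard_normal((5, 2))
C = rng.standard_normal((6, 2))
X = np.einsum('ir,jr,kr->ijk', A, B, C)  # rank-2 CPD tensor
cores = tt_svd(X)
```

Contracting the cores reproduces `X` up to floating-point error, and every TT-rank (the boundary dimensions of the cores) stays at or below the CPD rank 2, consistent with the TT-CPD equivalence the abstract refers to.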
Pages: 460 - 464 (5 pages)
Related papers
(50 records in total; entries [21] - [30] shown)
  • [21] Block tensor train decomposition for missing data estimation
    Lee, Namgil
    Kim, Jong-Min
    STATISTICAL PAPERS, 2018, 59 (04) : 1283 - 1305
  • [23] Active Fault Detection Based on Tensor Train Decomposition
    Puncochar, Ivo
    Straka, Ondrej
    Tichavsky, Petr
    IFAC PAPERSONLINE, 2024, 58 (04): : 676 - 681
  • [24] PARALLEL ALGORITHMS FOR COMPUTING THE TENSOR-TRAIN DECOMPOSITION
    Shi, Tianyi
    Ruth, Maximilian
    Townsend, Alex
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2023, 45 (03): : C101 - C130
  • [25] Nimble GNN Embedding with Tensor-Train Decomposition
    Yin, Chunxing
    Zheng, Da
    Nisa, Israt
    Faloutsos, Christos
    Karypis, George
    Vuduc, Richard
    PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022, : 2327 - 2335
  • [26] Linear dependencies in fourth-rank turbulence tensor models
    Reynolds, WC
    Kassinos, SC
    APPLIED MATHEMATICS LETTERS, 1998, 11 (05) : 79 - 83
  • [27] Accelerating Tensor Contraction Products via Tensor-Train Decomposition [Tips & Tricks]
    Kisil, Ilya
    Calvi, Giuseppe G.
    Konstantinidis, Kriton
    Xu, Yao Lei
    Mandic, Danilo P.
    IEEE SIGNAL PROCESSING MAGAZINE, 2022, 39 (05) : 63 - 70
  • [28] Tensor Completion using Low-Rank Tensor Train Decomposition by Riemannian Optimization
    Wang, Junli
    Zhao, Guangshe
    Wang, Dingheng
    Li, Guoqi
    2019 CHINESE AUTOMATION CONGRESS (CAC2019), 2019, : 3380 - 3384
  • [29] A Practical Approach for Employing Tensor Train Decomposition in Edge Devices
    Kokhazadeh, Milad
    Keramidas, Georgios
    Kelefouras, Vasilios
    Stamoulis, Iakovos
    INTERNATIONAL JOURNAL OF PARALLEL PROGRAMMING, 2024, 52 : 20 - 39
  • [30] Compressing 3DCNNs based on tensor train decomposition
    Wang, Dingheng
    Zhao, Guangshe
    Li, Guoqi
    Deng, Lei
    Wu, Yang
    NEURAL NETWORKS, 2020, 131 : 215 - 230