Tensor-tensor products with invertible linear transforms

Cited: 198
Authors
Kernfeld, Eric [1 ]
Kilmer, Misha [2 ]
Aeron, Shuchin [3 ]
Affiliations
[1] Univ Washington, Dept Stat, Seattle, WA 98195 USA
[2] Tufts Univ, Dept Math, Medford, MA 02155 USA
[3] Tufts Univ, Dept Elect & Comp Engn, Medford, MA 02155 USA
Funding
U.S. National Science Foundation (NSF)
Keywords
Tensor; Multiway; SVD; DCT; Linear transformation; Module; Decompositions; Factorization
DOI
10.1016/j.laa.2015.07.021
Chinese Library Classification (CLC)
O29 [Applied Mathematics]
Discipline code
070104
Abstract
Research in tensor representation and analysis has been rising in popularity in direct response to a) the increased ability of data collection systems to store huge volumes of multidimensional data and b) the recognition of potential modeling accuracy that can be provided by leaving the data and/or the operator in its natural, multidimensional form. In recent work [1], the authors introduced the notion of the t-product, a generalization of matrix multiplication for tensors of order three, which can be extended to multiply tensors of arbitrary order [2]. The multiplication is based on a convolution-like operation, which can be implemented efficiently using the Fast Fourier Transform (FFT). The corresponding linear algebraic framework from the original work was further developed in [3], and it allows one to elegantly generalize all classical algorithms from numerical linear algebra. In this paper, we extend this development so that tensor-tensor products can be defined in a so-called transform domain for any invertible linear transform. In order to properly motivate this transform-based approach, we begin by defining a new tensor-tensor product that serves as an alternative to the t-product. We then show that it can be implemented efficiently using DCTs (discrete cosine transforms), and that subsequent definitions and factorizations can be formulated by appealing to the transform domain. Using this new product as our guide, we then generalize the transform-based approach to any invertible linear transform. We introduce the algebraic structures induced by each new multiplication in the family, which are those of C*-algebras and modules. Finally, in the spirit of [4], we give a matrix-algebra-based interpretation of the new family of tensor-tensor products, and from an applied perspective, we briefly discuss how to choose a transform. We demonstrate the convenience of our new framework within the context of an image deblurring problem, and we show the potential for using one of these new tensor-tensor products and the resulting tensor-SVD for hyperspectral image compression. (C) 2015 Elsevier Inc. All rights reserved.
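For a concrete picture of the construction summarized in the abstract, the following NumPy sketch illustrates a transform-domain tensor-tensor product for third-order tensors: apply an invertible matrix M along the third mode of both tensors (a mode-3 product), multiply corresponding frontal slices, and transform back. The function names (transform_product, facewise_matmul) and the use of NumPy are illustrative assumptions, not the authors' reference implementation; choosing M to be the DFT matrix recovers the FFT-based t-product, while an orthogonal DCT matrix gives a DCT-based product in the same spirit as the alternative product mentioned in the abstract.

import numpy as np

def facewise_matmul(A_hat, B_hat):
    # Multiply corresponding frontal slices: (m, p, n3) x (p, q, n3) -> (m, q, n3).
    m, p, n3 = A_hat.shape
    _, q, _ = B_hat.shape
    C_hat = np.empty((m, q, n3), dtype=np.result_type(A_hat, B_hat))
    for k in range(n3):
        C_hat[:, :, k] = A_hat[:, :, k] @ B_hat[:, :, k]
    return C_hat

def transform_product(A, B, M):
    # Tensor-tensor product induced by an invertible n3 x n3 matrix M:
    # move A and B to the transform domain by applying M along mode 3,
    # multiply the frontal slices pairwise, then apply the inverse transform.
    A_hat = np.einsum('ij,abj->abi', M, A)
    B_hat = np.einsum('ij,abj->abi', M, B)
    C_hat = facewise_matmul(A_hat, B_hat)
    return np.einsum('ij,abj->abi', np.linalg.inv(M), C_hat)

# Example: with M equal to the (unnormalized) DFT matrix, the product above
# coincides with the FFT-based t-product of third-order tensors.
if __name__ == '__main__':
    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 3, 5))
    B = rng.standard_normal((3, 2, 5))
    F = np.fft.fft(np.eye(5))             # DFT matrix acting on mode-3 tubes
    C = transform_product(A, B, F).real   # imaginary part is ~0 for real data
    print(C.shape)                        # (4, 2, 5)

In the same spirit, a transform-domain tensor-SVD, such as the one the abstract applies to hyperspectral image compression, can be sketched by taking a matrix SVD of each frontal slice of A_hat and mapping the three factor tensors back with the inverse transform.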
Pages: 545-570
Page count: 26
Related papers
50 in total
  • [1] Multilinear discriminant analysis using tensor-tensor products
    Dufrenois, Franck
    El Ichi, Alaa
    Jbilou, Khalide
    JOURNAL OF MATHEMATICAL MODELING, 2023, 11 (01): 83-101
  • [2] Tensor-tensor model of gravity
    Gogberashvili, M. Y.
    THEORETICAL AND MATHEMATICAL PHYSICS, 1997, 113 (03): 1572-1581
  • [3] Tensor-tensor theory of gravitation
    Firmani, C.
    ASTROPHYSICS AND SPACE SCIENCE, 1971, 13 (01): 128-&
  • [4] Low-rank Tensor Completion with a New Tensor Nuclear Norm Induced by Invertible Linear Transforms
    Lu, Canyi
    Peng, Xi
    Wei, Yunchao
    2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2019), 2019: 5989-5997
  • [5] The tensor phase under a tensor-tensor product
    Ding, Jiadong
    Choi, Hayoung
    Wei, Yimin
    Xie, Pengpeng
    COMPUTATIONAL & APPLIED MATHEMATICS, 2025, 44 (01)
  • [6] Facial Recognition Using Tensor-Tensor Decompositions
    Hao, Ning
    Kilmer, Misha E.
    Braman, Karen
    Hoover, Randy C.
    SIAM JOURNAL ON IMAGING SCIENCES, 2013, 6 (01): 437-463
  • [7] Stochastic conditioning of tensor functions based on the tensor-tensor product
    Miao, Yun
    Wang, Tianru
    Wei, Yimin
    PACIFIC JOURNAL OF OPTIMIZATION, 2023, 19 (02): 205-235
  • [8] Tensor factorization via transformed tensor-tensor product for image alignment
    Xia, Sijia
    Qiu, Duo
    Zhang, Xiongjun
    NUMERICAL ALGORITHMS, 2024, 95 (03): 1251-1289