Randomized algorithms for fast computation of low rank tensor ring model

Cited by: 19
Authors
Ahmadi-Asl, Salman [1 ]
Cichocki, Andrzej [1 ,2 ]
Huy Phan, Anh [1 ]
Asante-Mensah, Maame G. [1 ]
Musavian Ghazani, Mirfarid [1 ]
Tanaka, Toshihisa [3 ]
Oseledets, Ivan [1 ]
Affiliations
[1] Skolkovo Inst Sci & Technol SKOLTECH, CDISE, Moscow, Russia
[2] Nicolaus Copernicus Univ, PL-87100 Torun, Poland
[3] Tokyo Univ Agr & Technol, Tokyo, Japan
Source
MACHINE LEARNING: SCIENCE AND TECHNOLOGY | 2021, Vol. 2, No. 1
Keywords
Tensor Ring-Tensor Train (TR-TT) decompositions; randomized algorithm; random projection; matrix product states; large-scale matrices; renormalization group; approximation; decomposition; optimization; completion; networks; reduction
DOI
10.1088/2632-2153/abad87
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Randomized algorithms are efficient techniques for big data tensor analysis. In this tutorial paper, we review and extend a variety of randomized algorithms for decomposing large-scale data tensors in the Tensor Ring (TR) format. We discuss both adaptive and nonadaptive randomized algorithms for this task. Our main focus is on the random projection technique as an efficient randomized framework and on how it can be used to decompose large-scale data tensors in the TR format. Simulations are provided to support the presentation, and the efficiency and performance of the presented algorithms are compared.
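To make the random projection idea mentioned in the abstract concrete, the following is a minimal sketch of a sequential TR-SVD in which each deterministic SVD is replaced by a Gaussian random-projection range finder (the standard randomized SVD building block). This is an illustrative sketch under generic assumptions, not the authors' exact algorithms: the function names, the oversampling parameter, and the Gaussian test matrix are common choices from the randomized linear algebra literature, not specifics taken from this paper.

```python
# Sketch of a randomized TR-SVD: a TT-style sequential sweep where every
# truncated SVD is computed with a Gaussian random-projection range finder.
import numpy as np

def randomized_svd(A, rank, rng, oversample=10):
    """Approximate rank-`rank` SVD of A via random projection.
    Assumes rank <= min(A.shape)."""
    m, n = A.shape
    k = min(rank + oversample, min(m, n))
    Omega = rng.standard_normal((n, k))            # Gaussian test matrix
    Q, _ = np.linalg.qr(A @ Omega)                 # orthonormal range basis
    B = Q.T @ A                                    # small projected matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub
    return U[:, :rank], s[:rank], Vt[:rank, :]

def tr_svd_randomized(X, ranks, seed=None):
    """Decompose an order-N array X into TR cores G[k] of shape
    (ranks[k], X.shape[k], ranks[k+1]), with the ring closed by
    ranks[N] == ranks[0]. Ranks are assumed feasible for the unfoldings."""
    rng = np.random.default_rng(seed)
    N, dims = X.ndim, X.shape
    r = list(ranks) + [ranks[0]]                   # wrap-around rank
    # First split: rank r[0]*r[1] factorization of the mode-1 unfolding.
    C = X.reshape(dims[0], -1)
    U, s, Vt = randomized_svd(C, r[0] * r[1], rng)
    G = [U.reshape(dims[0], r[0], r[1]).transpose(1, 0, 2)]
    # Carry the remainder with the wrap-around index moved to the end.
    C = (np.diag(s) @ Vt).reshape(r[0], r[1], -1).transpose(1, 2, 0)
    for k in range(1, N - 1):
        C = C.reshape(r[k] * dims[k], -1)
        U, s, Vt = randomized_svd(C, r[k + 1], rng)
        G.append(U.reshape(r[k], dims[k], r[k + 1]))
        C = np.diag(s) @ Vt
    # Last core absorbs the remainder and closes the ring.
    G.append(C.reshape(r[N - 1], dims[N - 1], r[N]))
    return G
```

As a quick sanity check, contracting the returned cores (the trace of the matrix product G[0][:, i_1, :] @ ... @ G[N-1][:, i_N, :]) should approximately reproduce the corresponding entries of X whenever the prescribed TR ranks are adequate for the data.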
Pages: 21