HIGH-ORDER TENSOR COMPLETION FOR DATA RECOVERY VIA SPARSE TENSOR-TRAIN OPTIMIZATION

Cited by: 0
Authors
Yuan, Longhao [1 ,2 ]
Zhao, Qibin [2 ,3 ]
Cao, Jianting [1 ,4 ]
Affiliations
[1] Saitama Inst Technol, Grad Sch Engn, Fukaya, Japan
[2] Ctr Adv Intelligence Project AIP, RIKEN, Tensor Learning Unit, Tokyo, Japan
[3] Guangdong Univ Technol, Sch Automat, Guangzhou, Guangdong, Peoples R China
[4] Hangzhou Dianzi Univ, Sch Comp Sci & Technol, Hangzhou, Zhejiang, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
incomplete data; tensor completion; tensor-train decomposition; tensorization; optimization
DOI
Not available
Chinese Library Classification
O42 [Acoustics]
Subject Classification Codes
070206; 082403
Abstract
In this paper, we address the problem of tensor data completion. Tensor-train decomposition is adopted because of its powerful representation ability and its linear scalability with tensor order. We propose an algorithm named Sparse Tensor-train Optimization (STTO), which treats the incomplete data as a sparse tensor and uses a first-order optimization method to find the factors of the tensor-train decomposition. Our algorithm performs well in simulation experiments for both low-order and high-order cases. We also employ a tensorization method that transforms the data into a higher-order form to further enhance performance. The results of image recovery experiments in various settings show that our method outperforms other completion algorithms. In particular, when the missing rate is very high, e.g., 90% to 99%, our method is significantly better than the state-of-the-art methods.
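To make the description above concrete, here is a minimal sketch of the kind of computation STTO performs: keep the observed entries as a sparse list of (index, value) pairs, evaluate each modeled entry as a chain product of tensor-train core slices, and update the cores with a first-order step on the squared error over the observed entries only. The function names, TT ranks, initialization, learning rate, and plain gradient-descent update below are assumptions made for illustration, not the exact algorithm or settings of the paper.

```python
# A minimal sketch of the idea behind STTO, assuming gradient descent on the
# squared error over the observed entries only. TT ranks, initialization
# scale, learning rate, and iteration counts are illustrative assumptions,
# not the settings reported in the paper.
import numpy as np


def tt_entry(cores, idx):
    """Evaluate one tensor entry as a chain product of TT core slices."""
    v = cores[0][:, idx[0], :]                  # shape (1, r_1)
    for k in range(1, len(cores)):
        v = v @ cores[k][:, idx[k], :]          # (1, r_k) @ (r_k, r_{k+1})
    return v[0, 0]


def stto_sketch(shape, obs_idx, obs_val, ranks, lr=0.5, iters=500, seed=0):
    """Fit TT cores to a sparse list of observed (index, value) pairs."""
    rng = np.random.default_rng(seed)
    d = len(shape)
    r = [1] + list(ranks) + [1]
    cores = [0.3 * rng.standard_normal((r[k], shape[k], r[k + 1]))
             for k in range(d)]
    n_obs = len(obs_idx)
    for _ in range(iters):
        grads = [np.zeros_like(c) for c in cores]
        for idx, y in zip(obs_idx, obs_val):
            # Forward: left partial products L_k = G_1[i_1] ... G_k[i_k].
            lefts = [cores[0][:, idx[0], :]]
            for k in range(1, d):
                lefts.append(lefts[-1] @ cores[k][:, idx[k], :])
            err = lefts[-1][0, 0] - y           # residual on this entry
            # Backward: right partial products; accumulate per-core gradients
            # of 0.5 * err**2, namely err * L_{k-1}^T R_{k+1}^T.
            right = np.eye(1)                   # R_{d+1}, shape (1, 1)
            for k in range(d - 1, -1, -1):
                left = lefts[k - 1] if k > 0 else np.eye(1)
                grads[k][:, idx[k], :] += err * (left.T @ right.T)
                right = cores[k][:, idx[k], :] @ right
        for k in range(d):                      # averaged first-order update
            cores[k] -= (lr / n_obs) * grads[k]
    return cores


if __name__ == "__main__":
    # Toy check: sample 150 entries of a random rank-(2, 2) 8x8x8 TT tensor,
    # fit cores from those entries, and report the error on the observed set.
    shape, ranks = (8, 8, 8), (2, 2)
    rng = np.random.default_rng(1)
    true = [rng.standard_normal((a, n, b))
            for a, n, b in zip((1, 2, 2), shape, (2, 2, 1))]
    full = [(i, j, k) for i in range(8) for j in range(8) for k in range(8)]
    obs_idx = [full[t] for t in rng.choice(len(full), 150, replace=False)]
    obs_val = [tt_entry(true, idx) for idx in obs_idx]
    est = stto_sketch(shape, obs_idx, obs_val, ranks)
    rmse = np.sqrt(np.mean([(tt_entry(est, idx) - y) ** 2
                            for idx, y in zip(obs_idx, obs_val)]))
    print("RMSE on observed entries:", rmse)
```

Because the loss in this sketch touches only the observed entries, one sweep costs on the order of (number of observed entries) x (tensor order) x (TT rank squared), independent of the size of the full tensor, which is consistent with the sparse-tensor view described in the abstract.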
Pages: 1258 - 1262
Number of pages: 5
Related Papers
50 records in total
  • [1] Completion of High Order Tensor Data with Missing Entries via Tensor-Train Decomposition
    Yuan, Longhao
    Zhao, Qibin
    Cao, Jianting
    NEURAL INFORMATION PROCESSING, ICONIP 2017, PT I, 2017, 10634 : 222 - 229
  • [2] Optimal High-Order Tensor SVD via Tensor-Train Orthogonal Iteration
    Zhou, Yuchen
    Zhang, Anru R.
    Zheng, Lili
    Wang, Yazhen
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2022, 68 (06) : 3991 - 4019
  • [3] High-order tensor completion via gradient-based optimization under tensor train format
    Yuan, Longhao
    Zhao, Qibin
    Gui, Lihua
    Cao, Jianting
    SIGNAL PROCESSING-IMAGE COMMUNICATION, 2019, 73 : 53 - 61
  • [4] Provable Tensor-Train Format Tensor Completion by Riemannian Optimization
    Cai, Jian-Feng
    Li, Jingyang
    Xia, Dong
    JOURNAL OF MACHINE LEARNING RESEARCH, 2022, 23 : 1 - 77
  • [5] A neural tensor decomposition model for high-order sparse data recovery
    Liao, Tianchi
    Yang, Jinghua
    Chen, Chuan
    Zheng, Zibin
    INFORMATION SCIENCES, 2024, 658
  • [6] Auto-weighted robust low-rank tensor completion via tensor-train
    Chen, Chuan
    Wu, Zhe-Bin
    Chen, Zi-Tai
    Zheng, Zi-Bin
    Zhang, Xiong-Jun
    INFORMATION SCIENCES, 2021, 567 : 100 - 115
  • [7] Tensor train completion: Local recovery guarantees via Riemannian optimization
    Budzinskiy, Stanislav
    Zamarashkin, Nikolai
    NUMERICAL LINEAR ALGEBRA WITH APPLICATIONS, 2023, 30 (06)
  • [8] Deep Learning Approach Based on Tensor-Train for Sparse Signal Recovery
    Zou, Cong
    Yang, Fang
    IEEE ACCESS, 2019, 7 : 34753 - 34761
  • [9] A Riemannian Rank-Adaptive Method for Higher-Order Tensor Completion in the Tensor-Train Format
    Vermeylen, Charlotte
    Van Barel, Marc
    NUMERICAL LINEAR ALGEBRA WITH APPLICATIONS, 2025, 32 (01)