A low-rank and sparse enhanced Tucker decomposition approach for tensor completion

Cited by: 6
Authors
Pan, Chenjian [1 ,2 ]
Ling, Chen [2 ]
He, Hongjin [1 ]
Qi, Liqun [3 ]
Xu, Yanwei [4 ]
Affiliations
[1] Ningbo Univ, Sch Math & Stat, Ningbo 315211, Peoples R China
[2] Hangzhou Dianzi Univ, Sch Sci, Hangzhou 310018, Peoples R China
[3] Hong Kong Polytech Univ, Dept Appl Math, Kowloon, Hong Kong, Peoples R China
[4] 2012 Labs Huawei Tech Investment Co Ltd, Future Network Theory Lab, Shatin, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Tensor completion; Tucker decomposition; Nuclear norm; Internet traffic data; Image inpainting; THRESHOLDING ALGORITHM; MATRIX FACTORIZATION; RECOVERY;
DOI
10.1016/j.amc.2023.128432
Chinese Library Classification
O29 [Applied Mathematics];
Discipline Code
070104;
Abstract
In this paper, we introduce a unified low-rank and sparse enhanced Tucker decomposition model for tensor completion. Our model includes a sparse regularization term that promotes a sparse core tensor in the Tucker decomposition, which is beneficial for tensor data compression. Moreover, we impose low-rank regularization terms on the factor matrices of the Tucker decomposition to induce low-rankness of the tensor at low computational cost. Numerically, we propose a customized splitting method with easy subproblems to solve the underlying model. Notably, our model can handle different types of real-world data sets, since it exploits the potential periodicity and inherent correlation properties that appear in tensors. A series of computational experiments on real-world data sets, including internet traffic data and color images, demonstrates that our model achieves higher recovery accuracy than many existing state-of-the-art matricization and tensorization approaches.
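To illustrate the general idea of Tucker-based tensor completion described in the abstract, the sketch below implements a plain truncated-HOSVD hard-imputation loop in NumPy: missing entries are alternately replaced by a low-multilinear-rank Tucker reconstruction while observed entries are kept fixed. This is a minimal illustrative surrogate, not the authors' regularized model or splitting method (it omits the sparse-core and factor-matrix low-rank penalties); all function names here are hypothetical.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n matricization: move axis `mode` to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def fold(M, mode, shape):
    """Inverse of unfold: reshape and move the leading axis back to `mode`."""
    full = [shape[mode]] + [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape(full), 0, mode)

def tucker_complete(T_obs, mask, ranks, n_iter=50):
    """Fill missing entries (mask == False) by alternating a truncated-HOSVD
    projection onto low multilinear rank with re-insertion of observed data.
    Illustrative only; no sparsity or nuclear-norm regularization."""
    X = T_obs.copy()
    X[~mask] = T_obs[mask].mean()  # initialize missing entries with the observed mean
    for _ in range(n_iter):
        # Leading left singular vectors of each mode unfolding (truncated HOSVD)
        U = [np.linalg.svd(unfold(X, n), full_matrices=False)[0][:, :r]
             for n, r in enumerate(ranks)]
        # Core tensor: compress every mode by U_n^T
        G = X
        for n, Un in enumerate(U):
            G = fold(Un.T @ unfold(G, n), n,
                     G.shape[:n] + (Un.shape[1],) + G.shape[n + 1:])
        # Low-multilinear-rank reconstruction: expand every mode by U_n
        Y = G
        for n, Un in enumerate(U):
            Y = fold(Un @ unfold(Y, n), n,
                     Y.shape[:n] + (Un.shape[0],) + Y.shape[n + 1:])
        X[~mask] = Y[~mask]  # update missing entries; observed entries stay fixed
    return X
```

In the paper's model the core `G` would additionally be driven toward sparsity and the factors `U` toward low rank via regularization terms, solved by a customized splitting method rather than this simple alternating projection.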
Pages: 15
Related Papers
50 records
  • [31] Iterative tensor eigen rank minimization for low-rank tensor completion
    Su, Liyu
    Liu, Jing
    Tian, Xiaoqing
    Huang, Kaiyu
    Tan, Shuncheng
    INFORMATION SCIENCES, 2022, 616 : 303 - 329
  • [32] Optimality conditions for Tucker low-rank tensor optimization
    Luo, Ziyan
    Qi, Liqun
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2023, 86 (03) : 1275 - 1298
  • [33] Tucker tensor decomposition with rank estimation for sparse hyperspectral unmixing
    Wu, Ling
    Huang, Jie
    Zhu, Zi-Yue
    INTERNATIONAL JOURNAL OF REMOTE SENSING, 2024, 45 (12) : 3992 - 4022
  • [35] Multilinear Tensor Rank Estimation via Sparse Tucker Decomposition
    Yokota, Tatsuya
    Cichocki, Andrzej
    2014 JOINT 7TH INTERNATIONAL CONFERENCE ON SOFT COMPUTING AND INTELLIGENT SYSTEMS (SCIS) AND 15TH INTERNATIONAL SYMPOSIUM ON ADVANCED INTELLIGENT SYSTEMS (ISIS), 2014, : 478 - 483
  • [36] Low-Rank Tensor Completion Method for Implicitly Low-Rank Visual Data
    Ji, Teng-Yu
    Zhao, Xi-Le
    Sun, Dong-Lin
    IEEE SIGNAL PROCESSING LETTERS, 2022, 29 : 1162 - 1166
  • [37] Decomposition Approach for Low-Rank Matrix Completion and Its Applications
    Ma, Rick
    Barzigar, Nafise
    Roozgard, Aminmohammad
    Cheng, Samuel
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2014, 62 (07) : 1671 - 1683
  • [38] Enhanced Nonconvex Low-Rank Approximation of Tensor Multi-Modes for Tensor Completion
    Zeng, Haijin
    Chen, Yongyong
    Xie, Xiaozhen
    Ning, Jifeng
    IEEE TRANSACTIONS ON COMPUTATIONAL IMAGING, 2021, 7 : 164 - 177
  • [39] Low-rank tensor completion by Riemannian optimization
    Kressner, Daniel
    Steinlechner, Michael
    Vandereycken, Bart
    BIT NUMERICAL MATHEMATICS, 2014, 54 (02) : 447 - 468
  • [40] CROSS: EFFICIENT LOW-RANK TENSOR COMPLETION
    Zhang, Anru
    ANNALS OF STATISTICS, 2019, 47 (02): : 936 - 964