Tensor train factorization under noisy and incomplete data with automatic rank estimation

Cited by: 10
Authors
Xu, Le [1 ]
Cheng, Lei [2 ]
Wong, Ngai [1 ]
Wu, Yik-Chung [1 ]
Affiliations
[1] Univ Hong Kong, Dept Elect & Elect Engn, HKSAR, Hong Kong, Peoples R China
[2] Zhejiang Univ, Coll Informat Sci & Elect Engn, 388 Yuhangtao Rd, Hangzhou, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Bayesian inference; Tensor completion; Tensor train; OPTIMIZATION; IMAGE;
DOI
10.1016/j.patcog.2023.109650
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
As a powerful tool for analyzing multi-dimensional data, tensor train (TT) decomposition shows superior performance compared to other tensor decomposition formats. Existing TT decomposition methods, however, either easily overfit to noise or require substantial fine-tuning to strike a balance between recovery accuracy and model complexity. To avoid these shortcomings, this paper treats TT decomposition from a fully Bayesian perspective, which includes automatic TT rank determination and noise power estimation. Theoretical justification is provided for adopting Gaussian-product-Gamma priors to induce sparsity on the slices of the TT cores, thus allowing the model complexity to be determined automatically even when the observed tensor data is noisy and contains many missing values. Furthermore, using the variational inference framework, an effective learning algorithm for the probabilistic model parameters is derived. Simulations on synthetic data demonstrate that the proposed algorithm accurately recovers the underlying TT structure from incomplete noisy observations. Further experiments on image and video data also show its superior performance over other existing TT decomposition algorithms. (c) 2023 Elsevier Ltd. All rights reserved.
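For context on the decomposition format the abstract refers to, the sketch below shows the classical TT-SVD procedure (sequential truncated SVDs), where the singular-value truncation threshold plays the role of a crude rank estimator. This is a minimal illustration of the TT format only, not the paper's Bayesian method with Gaussian-product-Gamma priors; the function names and the `eps` threshold are illustrative assumptions.

```python
import numpy as np

def tt_svd(tensor, eps=1e-10):
    """Decompose a full tensor into tensor-train (TT) cores via sequential
    truncated SVDs (classical TT-SVD, not the paper's Bayesian algorithm).
    TT ranks are estimated by discarding singular values below eps times
    the largest singular value at each unfolding."""
    dims = tensor.shape
    cores = []
    rank = 1
    mat = tensor.reshape(rank * dims[0], -1)
    for k in range(len(dims) - 1):
        U, s, Vt = np.linalg.svd(mat, full_matrices=False)
        keep = max(1, int(np.sum(s > eps * s[0])))  # truncation = rank estimate
        cores.append(U[:, :keep].reshape(rank, dims[k], keep))
        rank = keep
        # carry the remainder forward, unfolded along the next mode
        mat = (s[:keep, None] * Vt[:keep]).reshape(rank * dims[k + 1], -1)
    cores.append(mat.reshape(rank, dims[-1], 1))
    return cores

def tt_to_full(cores):
    """Contract a list of TT cores back into the full tensor."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.reshape([c.shape[1] for c in cores])
```

On noiseless data of exact low TT rank this recovers the ranks via the truncation threshold; the paper's contribution is to make that rank determination automatic and robust when the data is noisy and incomplete, where a fixed SVD threshold breaks down.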
Pages: 14
Related Papers
50 records total
  • [1] PROBABILISTIC TENSOR TRAIN DECOMPOSITION WITH AUTOMATIC RANK DETERMINATION FROM NOISY DATA
    Xu, Le
    Cheng, Lei
    Wong, Ngai
    Wu, Yik-Chung
    2021 IEEE STATISTICAL SIGNAL PROCESSING WORKSHOP (SSP), 2021, : 461 - 465
  • [2] Noise Adaptive Tensor Train Decomposition for Low-Rank Embedding of Noisy Data
    Li, Xinsheng
    Candan, K. Selcuk
    Sapino, Maria Luisa
    SIMILARITY SEARCH AND APPLICATIONS, SISAP 2020, 2020, 12440 : 203 - 217
  • [3] Bayesian CP Factorization of Incomplete Tensors with Automatic Rank Determination
    Zhao, Qibin
    Zhang, Liqing
    Cichocki, Andrzej
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2015, 37 (09) : 1751 - 1763
  • [4] Low-Rank Tensor Completion Using Matrix Factorization Based on Tensor Train Rank and Total Variation
    Ding, Meng
    Huang, Ting-Zhu
    Ji, Teng-Yu
    Zhao, Xi-Le
    Yang, Jing-Hua
    JOURNAL OF SCIENTIFIC COMPUTING, 2019, 81 (02) : 941 - 964
  • [5] Adaptive Rank Estimation Based Tensor Factorization Algorithm for Low-Rank Tensor Completion
    Liu, Han
    Liu, Jing
    Su, Liyu
    PROCEEDINGS OF THE 38TH CHINESE CONTROL CONFERENCE (CCC), 2019, : 3444 - 3449
  • [6] TENSOR QUANTILE REGRESSION WITH LOW-RANK TENSOR TRAIN ESTIMATION
    Liu, Zihuan
    Lee, Cheuk Yin
    Zhang, Heping
    ANNALS OF APPLIED STATISTICS, 2024, 18 (02): : 1294 - 1318
  • [7] Bayesian Robust Tensor Factorization for Incomplete Multiway Data
    Zhao, Qibin
    Zhou, Guoxu
    Zhang, Liqing
    Cichocki, Andrzej
    Amari, Shun-Ichi
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2016, 27 (04) : 736 - 748
  • [8] Tensor Train Factorization with Spatio-temporal Smoothness for Streaming Low-rank Tensor Completion
    Yu, Gaohang
    Wan, Shaochun
    Ling, Chen
    Qi, Liqun
    Xu, Yanwei
    FRONTIERS OF MATHEMATICS, 2024, 19 (05): : 933 - 959
  • [9] Low-Rank Tensor Train Coefficient Array Estimation for Tensor-on-Tensor Regression
    Liu, Yipeng
    Liu, Jiani
    Zhu, Ce
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, 31 (12) : 5402 - 5411