Tensor train factorization under noisy and incomplete data with automatic rank estimation

Cited: 10
Authors
Xu, Le [1 ]
Cheng, Lei [2 ]
Wong, Ngai [1 ]
Wu, Yik-Chung [1 ]
Affiliations
[1] Univ Hong Kong, Dept Elect & Elect Engn, HKSAR, Hong Kong, Peoples R China
[2] Zhejiang Univ, Coll Informat Sci & Elect Engn, 388 Yuhangtang Rd, Hangzhou, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Bayesian inference; Tensor completion; Tensor train; OPTIMIZATION; IMAGE;
DOI
10.1016/j.patcog.2023.109650
CLC Classification
TP18 [Theory of Artificial Intelligence];
Subject Classification
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
As a powerful tool for analyzing multi-dimensional data, tensor train (TT) decomposition shows superior performance compared to other tensor decomposition formats. Existing TT decomposition methods, however, either overfit easily in the presence of noise, or require substantial fine-tuning to strike a balance between recovery accuracy and model complexity. To avoid these shortcomings, this paper treats TT decomposition from a fully Bayesian perspective, which includes automatic TT rank determination and noise power estimation. Theoretical justification is provided for adopting Gaussian-product-Gamma priors to induce sparsity on the slices of the TT cores, thus allowing the model complexity to be determined automatically even when the observed tensor data is noisy and contains many missing values. Furthermore, an effective learning algorithm for the probabilistic model parameters is derived using the variational inference framework. Simulations on synthetic data demonstrate that the proposed algorithm accurately recovers the underlying TT structure from incomplete noisy observations. Further experiments on image and video data also show its superior performance over existing TT decomposition algorithms. (c) 2023 Elsevier Ltd. All rights reserved.
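For orientation only, the NumPy sketch below illustrates the TT format the abstract refers to: a full tensor is recovered by contracting a chain of TT cores, and zeroing matching lateral slices of adjacent cores removes one rank component, which is the kind of slice-level sparsity the paper's priors are meant to induce. This is not the paper's Bayesian algorithm; the shapes, variable names, and the tt_to_tensor helper are illustrative assumptions.

import numpy as np

# Contract TT cores G_k of shape (r_{k-1}, n_k, r_k), with boundary ranks
# r_0 = r_N = 1, into a full tensor. Entry-wise,
# X[i1, ..., iN] = G_1[:, i1, :] @ G_2[:, i2, :] @ ... @ G_N[:, iN, :].
def tt_to_tensor(cores):
    full = cores[0]                                  # shape (1, n_1, r_1)
    for core in cores[1:]:
        # contract the trailing rank index of `full` with the leading rank of `core`
        full = np.tensordot(full, core, axes=([-1], [0]))
    return np.squeeze(full, axis=(0, -1))            # drop the boundary ranks

rng = np.random.default_rng(0)
n, r = (8, 9, 10), (3, 4)                            # tensor sizes and TT ranks
cores = [rng.standard_normal((1, n[0], r[0])),
         rng.standard_normal((r[0], n[1], r[1])),
         rng.standard_normal((r[1], n[2], 1))]
X = tt_to_tensor(cores)                              # full tensor, shape (8, 9, 10)

# Zeroing the matching slices of adjacent cores removes one rank component,
# i.e. the result equals a TT reconstruction with rank r_1 reduced by one.
cores[0][:, :, -1] = 0.0
cores[1][-1, :, :] = 0.0
X_reduced = tt_to_tensor(cores)
print(X.shape, X_reduced.shape)                      # (8, 9, 10) (8, 9, 10)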
Pages: 14
Related Papers
50 records in total
  • [31] Scaling and Scalability: Provable Nonconvex Low-Rank Tensor Estimation from Incomplete Measurements
    Tong, Tian
    Ma, Cong
    Prater-Bennette, Ashley
    Tripp, Erin
    Chi, Yuejie
    JOURNAL OF MACHINE LEARNING RESEARCH, 2022, 23
  • [32] Efficient Tensor Completion Methods for 5-D Seismic Data Reconstruction: Low-Rank Tensor Train and Tensor Ring
    Liu, Dawei
    Sacchi, Mauricio D.
    Chen, Wenchao
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2022, 60
  • [34] Feature Extraction for Incomplete Data Via Low-Rank Tensor Decomposition With Feature Regularization
    Shi, Qiquan
    Cheung, Yiu-Ming
    Zhao, Qibin
    Lu, Haiping
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2019, 30 (06) : 1803 - 1817
  • [35] 3-DIMENSIONAL RECONSTRUCTION FROM PROJECTIONS WITH INCOMPLETE AND NOISY DATA BY OBJECT ESTIMATION
    BRESLER, Y
    MACOVSKI, A
    IEEE TRANSACTIONS ON ACOUSTICS SPEECH AND SIGNAL PROCESSING, 1987, 35 (08): 1139 - 1152
  • [36] A Low-Rank Tensor Train Approach for Electric Vehicle Load Data Reconstruction Using Real Industrial Data
    Sun, Bo
    Xu, Yijun
    Gu, Wei
    Cai, Huihuang
    Lu, Shuai
    Mili, Lamine
    Yu, Wenwu
    Wu, Zhi
    IEEE TRANSACTIONS ON SMART GRID, 2025, 16 (02) : 1911 - 1924
  • [37] Support Vector Machine based on Low-rank Tensor Train Decomposition for Big Data Applications
    Wang, Yongkang
    Zhang, Weicheng
    Yu, Zhuliang
    Gu, Zhenghui
    Liu, Hai
    Cai, Zhaoquan
    Wang, Congjun
    Gao, Shihan
    PROCEEDINGS OF THE 2017 12TH IEEE CONFERENCE ON INDUSTRIAL ELECTRONICS AND APPLICATIONS (ICIEA), 2017, : 850 - 853
  • [38] Big Data Matrix Singular Value Decomposition Based on Low-Rank Tensor Train Decomposition
    Lee, Namgil
    Cichocki, Andrzej
    ADVANCES IN NEURAL NETWORKS - ISNN 2014, 2014, 8866 : 121 - 130
  • [39] On the estimation of odds ratios under order restrictions with incomplete data
    Oluyede, BO
    DIMENSION REDUCTION, COMPUTATIONAL COMPLEXITY AND INFORMATION, 1998, 30 : 444 - 449
  • [40] Functional Transform-Based Low-Rank Tensor Factorization for Multi-dimensional Data Recovery
    Wang, Jianli
    Zhao, Xile
    COMPUTER VISION - ECCV 2024, PT XXXI, 2025, 15089 : 39 - 56