Low-Rank Approximation of Structural Redundancy for Self-Supervised Learning

Cited: 0
Authors
Du, Kang [1]
Xiang, Yu [1]
Affiliations
[1] Univ Utah, Salt Lake City, UT 84112 USA
Keywords
Self-supervised learning; redundancy; low-rank approximation; ridge regression; bounds
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
We study the data-generating mechanism for reconstructive SSL to shed light on its effectiveness. Given an infinite amount of labeled samples, we provide a necessary and sufficient condition for perfect linear approximation. The condition reveals a full-rank component that preserves the label classes of Y, along with a redundant component. Motivated by this condition, we propose to approximate the redundant component by a low-rank factorization and measure the approximation quality via a new quantity epsilon(s), parameterized by the rank s of the factorization. We incorporate epsilon(s) into the excess risk analysis under both linear regression and ridge regression settings; the latter regularization approach handles scenarios where the dimension of the learned features is much larger than the number n of labeled samples for downstream tasks. We design three stylized experiments comparing SSL with supervised learning under different settings to support our theoretical findings.
Pages: 1008-1032 (25 pages)
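As a minimal sketch of the quantities discussed in the abstract: the best rank-s approximation of a matrix is given by its truncated SVD (Eckart-Young), and its residual is one natural way to instantiate a rank-parameterized error like epsilon(s); ridge regression has the familiar closed form used for the high-dimensional downstream setting. Note that the paper's exact definition of epsilon(s) is not reproduced here, and all function names below are illustrative, not the authors'.

```python
import numpy as np

def low_rank_approx(R, s):
    """Best rank-s approximation of R via truncated SVD (Eckart-Young)."""
    U, sigma, Vt = np.linalg.svd(R, full_matrices=False)
    return (U[:, :s] * sigma[:s]) @ Vt[:s, :]

def eps_of_rank(R, s):
    """Spectral-norm residual of the best rank-s approximation of R.

    Illustrative stand-in for a rank-parameterized error epsilon(s);
    equals the (s+1)-th singular value of R.
    """
    return np.linalg.norm(R - low_rank_approx(R, s), ord=2)

def ridge(X, y, lam):
    """Closed-form ridge regression: (X^T X + lam * I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

rng = np.random.default_rng(0)
R = rng.standard_normal((50, 8)) @ rng.standard_normal((8, 30))  # true rank <= 8
print(eps_of_rank(R, 8))  # ~0: residual vanishes once s reaches the true rank
print(eps_of_rank(R, 4) >= eps_of_rank(R, 6))  # the residual is nonincreasing in s
```

The monotonicity in s is what makes such a quantity useful in an excess risk bound: increasing the factorization rank trades a larger model against a smaller redundancy-approximation error.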