Fast Low-Rank Matrix Learning with Nonconvex Regularization

Citations: 36
Authors
Yao, Quanming [1 ]
Kwok, James T. [1 ]
Zhong, Wenliang [1 ]
Affiliations
[1] Hong Kong Univ Sci & Technol, Dept Comp Sci & Engn, Hong Kong, Hong Kong, Peoples R China
Keywords
Low-rank matrix; Nonconvex optimization; Proximal gradient; Matrix completion; Robust PCA; VARIABLE SELECTION; COMPLETION; RELAXATION; ALGORITHMS;
DOI
10.1109/ICDM.2015.9
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Low-rank modeling has many important applications in machine learning, computer vision, and social network analysis. While the matrix rank is often approximated by the convex nuclear norm, nonconvex low-rank regularizers have demonstrated better recovery performance. However, the resulting optimization problem is much more challenging. A recent state-of-the-art approach is based on the proximal gradient algorithm, but it requires an expensive full SVD in each proximal step. In this paper, we show that for many commonly used nonconvex low-rank regularizers, a cutoff can be derived to automatically threshold the singular values obtained from the proximal operator. This allows the SVD to be approximated efficiently by the power method. Moreover, the proximal operator can be reduced to that of a much smaller matrix projected onto this leading subspace. Convergence at a rate of O(1/T), where T is the number of iterations, is guaranteed. Extensive experiments are performed on matrix completion and robust principal component analysis. The proposed method achieves significant speedup over the state-of-the-art, and the matrix solution obtained is more accurate and of lower rank than that produced by the traditional nuclear-norm regularizer.
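The subspace idea described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's algorithm: it uses a few steps of the power method to capture the leading left singular subspace, reduces the proximal operator to a small projected matrix, and substitutes plain nuclear-norm soft-thresholding for the paper's general nonconvex thresholding rule. All function names (`power_method`, `approx_prox_nuclear`) and parameter choices are hypothetical.

```python
import numpy as np

def power_method(A, R, n_iter=3):
    # Approximate an orthonormal basis Q for the leading left singular
    # subspace of A, starting from a random sketch R (n x k).
    Q, _ = np.linalg.qr(A @ R)
    for _ in range(n_iter):
        Q, _ = np.linalg.qr(A @ (A.T @ Q))
    return Q  # m x k, orthonormal columns

def approx_prox_nuclear(X, lam, k, n_iter=3, rng=None):
    # Approximate prox of lam * ||.||_* using a rank-k subspace.
    # Soft-thresholding here stands in for a nonconvex thresholding rule.
    rng = np.random.default_rng(rng)
    m, n = X.shape
    Q = power_method(X, rng.standard_normal((n, k)), n_iter)
    B = Q.T @ X                                   # small k x n matrix
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    s_thr = np.maximum(s - lam, 0.0)              # threshold singular values
    keep = s_thr > 0                              # cutoff: drop zeroed ones
    # Lift the small solution back to the original space.
    return (Q @ U[:, keep]) * s_thr[keep] @ Vt[keep]
```

When `X` is (close to) low rank and `k` is at least its rank, the subspace captures the column space and the result matches the exact singular-value-thresholding operator while only a k x n matrix is ever decomposed.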
Pages: 539-548
Number of Pages: 10
Related Papers
50 records in total
  • [41] THE NONCONVEX GEOMETRY OF LOW-RANK MATRIX OPTIMIZATIONS WITH GENERAL OBJECTIVE FUNCTIONS
    Li, Qiuwei
    Tang, Gongguo
    2017 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP 2017), 2017, : 1235 - 1239
  • [42] A nonconvex approach to low-rank matrix completion using convex optimization
    Lazzaro, Damiana
    NUMERICAL LINEAR ALGEBRA WITH APPLICATIONS, 2016, 23 (05) : 801 - 824
  • [43] A Unified Computational and Statistical Framework for Nonconvex Low-Rank Matrix Estimation
    Wang, Lingxiao
    Zhang, Xiao
    Gu, Quanquan
    ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 54, 2017, 54 : 981 - 990
  • [44] Accelerated PALM for Nonconvex Low-Rank Matrix Recovery With Theoretical Analysis
    Zhang, Hengmin
    Wen, Bihan
    Zha, Zhiyuan
    Zhang, Bob
    Tang, Yang
    Yu, Guo
    Du, Wenli
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2024, 34 (04) : 2304 - 2317
  • [45] A Unified Framework for Nonconvex Low-Rank plus Sparse Matrix Recovery
    Zhang, Xiao
    Wang, Lingxiao
    Gu, Quanquan
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 84, 2018, 84
  • [46] An Unsupervised Image Denoising Method Using a Nonconvex Low-Rank Model with TV Regularization
    Chen, Tianfei
    Xiang, Qinghua
    Zhao, Dongliang
    Sun, Lijun
    APPLIED SCIENCES-BASEL, 2023, 13 (12):
  • [47] Low-rank factorization for rank minimization with nonconvex regularizers
    April Sagan
    John E. Mitchell
    Computational Optimization and Applications, 2021, 79 : 273 - 300
  • [48] Low-rank factorization for rank minimization with nonconvex regularizers
    Sagan, April
    Mitchell, John E.
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2021, 79 (02) : 273 - 300
  • [49] Learning Low-Rank Representation for Matrix Completion
    Kwon, Minsu
    Choi, Ho-Jin
    2020 IEEE INTERNATIONAL CONFERENCE ON BIG DATA AND SMART COMPUTING (BIGCOMP 2020), 2020, : 161 - 164
  • [50] Fast Gradient Method for Low-Rank Matrix Estimation
    Hongyi Li
    Zhen Peng
    Chengwei Pan
    Di Zhao
    Journal of Scientific Computing, 2023, 96