Fast Low-Rank Matrix Learning with Nonconvex Regularization

Cited by: 36
Authors
Yao, Quanming [1]
Kwok, James T. [1]
Zhong, Wenliang [1]
Affiliations
[1] Hong Kong University of Science and Technology, Department of Computer Science and Engineering, Hong Kong, China
Keywords
Low-rank matrix; Nonconvex optimization; Proximal gradient; Matrix completion; Robust PCA; Variable selection; Completion; Relaxation; Algorithms
DOI
10.1109/ICDM.2015.9
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Low-rank modeling has many important applications in machine learning, computer vision, and social network analysis. While the matrix rank is often approximated by the convex nuclear norm, nonconvex low-rank regularizers have demonstrated better recovery performance. However, the resulting optimization problem is much more challenging. A recent state-of-the-art approach is based on the proximal gradient algorithm, but it requires an expensive full SVD in each proximal step. In this paper, we show that for many commonly used nonconvex low-rank regularizers, a cutoff can be derived that automatically thresholds the singular values obtained from the proximal operator. This allows the power method to be used to approximate the SVD efficiently. Moreover, the proximal operator can be reduced to that of a much smaller matrix projected onto this leading subspace. Convergence at a rate of O(1/T), where T is the number of iterations, can be guaranteed. Extensive experiments are performed on matrix completion and robust principal component analysis. The proposed method achieves significant speedup over the state-of-the-art. Moreover, the matrix solution obtained is more accurate and has a lower rank than that produced by the traditional nuclear norm regularizer.
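The computational idea sketched in the abstract (threshold the singular values returned by the proximal operator, approximate only the leading subspace with the power method, then apply the proximal step to a much smaller projected matrix) can be illustrated with a minimal NumPy sketch. This is not the authors' released code: it assumes a capped-ℓ1 regularizer λ·min(σ, θ) as the nonconvex penalty on singular values, a user-chosen target rank k for the power method, and the helper names power_method, capped_l1_prox, and approx_prox_step are illustrative only.

```python
import numpy as np

def power_method(Z, R, n_iters=3):
    # Orthonormal basis for an approximate leading column space of Z,
    # warm-started from R (one column per target singular vector).
    Q = np.linalg.qr(Z @ R)[0]
    for _ in range(n_iters):
        Q = np.linalg.qr(Z @ (Z.T @ Q))[0]
    return Q

def capped_l1_prox(sigma, lam, theta):
    # Elementwise proximal map of the capped-l1 penalty lam * min(s, theta),
    # found by comparing the minimizers of the two pieces s <= theta and s >= theta.
    # Small singular values are set exactly to zero -- the "cutoff" behavior.
    cand_a = np.minimum(np.maximum(sigma - lam, 0.0), theta)  # soft-thresholding piece
    cand_b = np.maximum(sigma, theta)                         # flat-penalty piece
    obj_a = 0.5 * (cand_a - sigma) ** 2 + lam * cand_a
    obj_b = 0.5 * (cand_b - sigma) ** 2 + lam * theta
    return np.where(obj_a <= obj_b, cand_a, cand_b)

def approx_prox_step(Z, k, lam, theta):
    # Approximate prox of lam * sum_i min(sigma_i(X), theta) at Z:
    # restrict to a rank-k leading subspace, take the SVD of the small
    # k x n matrix Q^T Z, threshold its singular values, and reproject.
    R = np.random.randn(Z.shape[1], k)                    # random subspace warm start
    Q = power_method(Z, R)                                # m x k orthonormal basis
    U, s, Vt = np.linalg.svd(Q.T @ Z, full_matrices=False)  # cheap k x n SVD
    s = capped_l1_prox(s, lam, theta)
    keep = s > 0                                          # drop thresholded singular values
    return (Q @ U[:, keep]) * s[keep] @ Vt[keep]
```

A full proximal-gradient iteration would then look like X = approx_prox_step(X - eta * grad_f(X), k, eta * lam, theta) for a step size eta; in the paper the power method is warm-started from the previous iterate's singular subspace rather than a random matrix, which is what keeps the per-iteration cost low in practice.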
Pages: 539-548 (10 pages)