Generalized Low-Rank Approximations of Matrices Revisited

Cited: 28
Authors
Liu, Jun [1 ]
Chen, Songcan [1 ]
Zhou, Zhi-Hua [2 ]
Tan, Xiaoyang [1 ]
Affiliations
[1] Nanjing Univ Aeronaut & Astronaut, Dept Comp Sci & Engn, Nanjing 210016, Peoples R China
[2] Nanjing Univ, Natl Key Lab Novel Software Technol, Nanjing 210093, Peoples R China
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2010, Vol. 21, Issue 4
Funding
US National Science Foundation;
Keywords
Dimensionality reduction; singular value decomposition (SVD); generalized low-rank approximations of matrices (GLRAM); reconstruction error; representation
DOI
10.1109/TNN.2010.2040290
CLC number
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Compared to the singular value decomposition (SVD), generalized low-rank approximations of matrices (GLRAM) can consume less computation time, achieve a higher compression ratio, and yield competitive classification performance. GLRAM has been successfully applied to applications such as image compression and retrieval, and quite a few extensions have since been proposed. However, several basic properties and crucial problems regarding GLRAM have not yet been explored or solved in the literature. For this reason, we revisit GLRAM in this paper. First, we reveal a close relationship between GLRAM and SVD: GLRAM's objective function is identical to SVD's except for the imposed constraints. Second, we derive a lower bound on GLRAM's objective function and discuss when this bound can be attained. Moreover, from the viewpoint of minimizing the lower bound, we answer an open problem raised by Ye (Machine Learning, 2005), namely, a theoretical justification of the experimental observation that, for a given number of reduced dimensions, the lowest reconstruction error is obtained when the left and right transformations have an equal number of columns. Third, we explore when and why GLRAM performs well in terms of compression, a fundamental problem concerning the usability of GLRAM.
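The objective the abstract refers to is to minimize the total reconstruction error sum_i ||A_i - L M_i R^T||_F^2 over column-orthonormal left and right transformations L and R, with M_i = L^T A_i R. A common way to solve it is the alternating eigendecomposition scheme of Ye (2005). A minimal NumPy sketch under those assumptions (the function name `glram`, initialization, and iteration count are illustrative, not the paper's exact implementation):

```python
import numpy as np

def glram(As, l1, l2, n_iter=20, seed=0):
    """Alternating scheme for GLRAM: minimize sum_i ||A_i - L M_i R^T||_F^2
    subject to L^T L = I (m x l1) and R^T R = I (n x l2)."""
    m, n = As[0].shape
    rng = np.random.default_rng(seed)
    # start from a random R with orthonormal columns
    R, _ = np.linalg.qr(rng.standard_normal((n, l2)))
    for _ in range(n_iter):
        # with R fixed, L = top-l1 eigenvectors of sum_i A_i R R^T A_i^T
        SL = sum(A @ R @ R.T @ A.T for A in As)
        L = np.linalg.eigh(SL)[1][:, -l1:]   # eigh sorts ascending
        # with L fixed, R = top-l2 eigenvectors of sum_i A_i^T L L^T A_i
        SR = sum(A.T @ L @ L.T @ A for A in As)
        R = np.linalg.eigh(SR)[1][:, -l2:]
    Ms = [L.T @ A @ R for A in As]           # compressed cores
    return L, R, Ms
```

Each alternating step is optimal for the variable being updated, so the reconstruction error is monotonically non-increasing across iterations; unlike SVD on each A_i separately, the two transformations are shared by all matrices, which is the source of GLRAM's higher compression ratio.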
Pages: 621-632 (12 pages)