Generalized Low-Rank Approximations of Matrices Revisited

Cited: 28
Authors
Liu, Jun [1]
Chen, Songcan [1]
Zhou, Zhi-Hua [2]
Tan, Xiaoyang [1]
Affiliations
[1] Nanjing Univ Aeronaut & Astronaut, Dept Comp Sci & Engn, Nanjing 210016, Peoples R China
[2] Nanjing Univ, Natl Key Lab Novel Software Technol, Nanjing 210093, Peoples R China
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS, 2010, Vol. 21, No. 4
Funding
National Science Foundation (US);
Keywords
Dimensionality reduction; singular value decomposition (SVD); generalized low-rank approximations of matrices (GLRAM); reconstruction error; REPRESENTATION;
DOI
10.1109/TNN.2010.2040290
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Compared to the singular value decomposition (SVD), generalized low-rank approximations of matrices (GLRAM) requires less computation time, achieves a higher compression ratio, and yields competitive classification performance. GLRAM has been successfully applied to tasks such as image compression and retrieval, and quite a few extensions have been proposed. However, some basic properties and crucial problems regarding GLRAM have not yet been explored or solved in the literature. To this end, we revisit GLRAM in this paper. First, we reveal a close relationship between GLRAM and SVD: GLRAM's objective function is identical to SVD's except for the imposed constraints. Second, we derive a lower bound on GLRAM's objective function and discuss when this bound can be attained. Moreover, from the viewpoint of minimizing the lower bound, we answer an open problem raised by Ye (Machine Learning, 2005), i.e., we give a theoretical justification of the experimental observation that, for a given number of reduced dimensions, the lowest reconstruction error is obtained when the left and right transformations have an equal number of columns. Third, we explore when and why GLRAM can perform well in terms of compression, a fundamental problem concerning the usability of GLRAM.
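The GLRAM objective the abstract refers to can be illustrated with a minimal numerical sketch. Given data matrices A_i, GLRAM seeks column-orthonormal transformations L and R minimizing sum_i ||A_i - L M_i R^T||_F^2 with M_i = L^T A_i R, typically via the alternating eigendecomposition procedure of Ye (2005). The code below is an illustrative sketch of that procedure, not the authors' implementation; the function name and parameters are our own:

```python
import numpy as np

def glram(As, l1, l2, n_iter=20, seed=0):
    """Alternating GLRAM sketch: find column-orthonormal L (r x l1) and
    R (c x l2) minimizing sum_i ||A_i - L L^T A_i R R^T||_F^2."""
    rng = np.random.default_rng(seed)
    r, c = As[0].shape
    # Initialize R with orthonormal columns.
    R = np.linalg.qr(rng.standard_normal((c, l2)))[0]
    for _ in range(n_iter):
        # Fix R: L = top-l1 eigenvectors of sum_i A_i R R^T A_i^T.
        SL = sum(A @ R @ R.T @ A.T for A in As)
        L = np.linalg.eigh(SL)[1][:, -l1:]   # eigh sorts ascending
        # Fix L: R = top-l2 eigenvectors of sum_i A_i^T L L^T A_i.
        SR = sum(A.T @ L @ L.T @ A for A in As)
        R = np.linalg.eigh(SR)[1][:, -l2:]
    Ms = [L.T @ A @ R for A in As]  # reduced l1 x l2 representations
    return L, R, Ms
```

Note that, unlike SVD applied to each vectorized matrix separately, the same pair (L, R) is shared across all A_i, which is what yields the higher compression ratio mentioned above.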
Pages: 621-632 (12 pages)
Related Papers (50 in total)
  • [1] An analytical algorithm for generalized low-rank approximations of matrices
    Liang, ZZ
    Shi, PF
    PATTERN RECOGNITION, 2005, 38 (11) : 2213 - 2216
  • [2] Comments on "An analytical algorithm for generalized low-rank approximations of matrices"
    Hu, Yafeng
    Lv, Hairong
    Zhang, Xianda
    PATTERN RECOGNITION, 2008, 41 (06) : 2133 - 2135
  • [3] Generalized low rank approximations of matrices
    Ye, JP
    MACHINE LEARNING, 2005, 61 (1-3) : 167 - 191
  • [4] Robust Generalized Low Rank Approximations of Matrices
    Shi, Jiarong
    Yang, Wei
    Zheng, Xiuyun
    PLOS ONE, 2015, 10 (09)
  • [5] SYMMETRIC GENERALIZED LOW RANK APPROXIMATIONS OF MATRICES
    Inoue, Kohei
    Hara, Kenji
    Urahama, Kiichi
    2012 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2012 : 949 - 952
  • [6] DEFECT INSPECTION FOR STRUCTURAL TEXTURE SURFACE BASED ON GENERALIZED LOW-RANK APPROXIMATIONS OF MATRICES
    Wang, Yan-Xing
    Cen, Yi-Gang
    Liang, Lie-Quan
    Zeng, Ming
    Mladenovic, Vladimir
    2017 INTERNATIONAL SYMPOSIUM ON INTELLIGENT SIGNAL PROCESSING AND COMMUNICATION SYSTEMS (ISPACS 2017), 2017 : 486 - 490
  • [7] Generalized low-rank approximations of matrices with missing components and its applications in image processing
    Li, Lu
    Dong, Qiulei
    Zhao, Ruizhen
    Jisuanji Fuzhu Sheji Yu Tuxingxue Xuebao/Journal of Computer-Aided Design and Computer Graphics, 2015, 27 (11) : 2065 - 2076
  • [8] A compact Heart iteration for low-rank approximations of large matrices
    Dax, Achiya
    JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS, 2024, 437
  • [9] Singular Values of Dual Quaternion Matrices and Their Low-Rank Approximations
    Ling, Chen
    He, Hongjin
    Qi, Liqun
    NUMERICAL FUNCTIONAL ANALYSIS AND OPTIMIZATION, 2022, 43 (12) : 1423 - 1458