Fast optimization algorithm on Riemannian manifolds and its application in low-rank learning

Cited by: 5
Authors
Chen, Haoran [1 ]
Sun, Yanfeng [1 ]
Gao, Junbin [2 ]
Hu, Yongli [1 ]
Yin, Baocai [3 ]
Affiliations
[1] Beijing Univ Technol, Fac Informat Technol, Beijing Key Lab Multimedia & Intelligent Software, Beijing 100124, Peoples R China
[2] Univ Sydney, Business Sch, Discipline Business Analyt, Sydney, NSW 2006, Australia
[3] Dalian Univ Technol, Fac Elect Informat & Elect Engn, Dalian 116024, Peoples R China
Funding
National Natural Science Foundation of China; Beijing Natural Science Foundation
Keywords
Fast optimization algorithm; Riemannian manifolds; Low-rank matrix variety; Low-rank representation; Subspace pursuit; Augmented Lagrange method; Clustering;
DOI
10.1016/j.neucom.2018.02.058
CLC classification
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
This paper proposes a first-order fast optimization algorithm (FOA) on Riemannian manifolds to speed up optimization for a class of composite functions defined on such manifolds. Theoretical analysis shows that FOA achieves the optimal convergence rate for the sequence of function values. Experiments on matrix completion show that FOA outperforms existing first-order optimization methods on Riemannian manifolds. Based on FOA, a subspace pursuit method, SP-RPRG(ALM), is also proposed to solve the low-rank representation model with the augmented Lagrange method (ALM) on the low-rank matrix variety. Experimental results on synthetic data and public databases demonstrate that both FOA and SP-RPRG(ALM) achieve faster convergence and higher accuracy. The experimental code is publicly available at https://github.com/Haoran2014. (c) 2018 Elsevier B.V. All rights reserved.
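The abstract does not spell out the FOA update rule, so the following is only a minimal sketch of what an accelerated first-order method on a fixed-rank matrix set can look like for the matrix completion task mentioned above: a Nesterov-style extrapolation step combined with a truncated-SVD retraction. All function names and parameters (`svd_retract`, `accelerated_fixed_rank_completion`, the step size, the iteration count) are illustrative assumptions, not taken from the authors' algorithm or repository.

```python
# Illustrative sketch (NOT the authors' FOA): accelerated gradient descent with
# an SVD retraction onto the set of rank-r matrices, applied to matrix completion.
import numpy as np

def svd_retract(X, rank):
    """Map an arbitrary matrix back onto the rank-r set via truncated SVD."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :rank] * s[:rank]) @ Vt[:rank, :]

def observed_gradient(X, M, mask):
    """Euclidean gradient of 0.5 * ||mask * (X - M)||_F^2."""
    return mask * (X - M)

def accelerated_fixed_rank_completion(M, mask, rank=5, step=1.0, iters=200):
    X = svd_retract(mask * M, rank)   # rank-r initial guess from observed entries
    Y = X.copy()                      # extrapolation (momentum) point
    t = 1.0
    for _ in range(iters):
        g = observed_gradient(Y, M, mask)
        X_new = svd_retract(Y - step * g, rank)          # gradient step + retraction
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0  # Nesterov momentum schedule
        Y = svd_retract(X_new + ((t - 1.0) / t_new) * (X_new - X), rank)
        X, t = X_new, t_new
    return X

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((60, 5)) @ rng.standard_normal((5, 60))  # true rank-5 matrix
    mask = rng.random(A.shape) < 0.4                                 # ~40% observed entries
    X = accelerated_fixed_rank_completion(A, mask, rank=5)
    print("relative error:", np.linalg.norm(X - A) / np.linalg.norm(A))
```

The retraction of the extrapolation point back onto the rank-r set is one of several possible choices; a manifold-aware implementation would instead transport the momentum term along the tangent space, as done in Riemannian optimization toolboxes.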
Pages: 59-70
Page count: 12
Related papers
50 records in total
  • [31] Learning Markov Models Via Low-Rank Optimization
    Zhu, Ziwei
    Li, Xudong
    Wang, Mengdi
    Zhang, Anru
    OPERATIONS RESEARCH, 2022, 70 (04) : 2384 - 2398
  • [32] A Fast Algorithm for Convolutional Structured Low-Rank Matrix Recovery
    Ongie, Gregory
    Jacob, Mathews
    IEEE TRANSACTIONS ON COMPUTATIONAL IMAGING, 2017, 3 (04) : 535 - 550
  • [33] Fast Randomized Singular Value Thresholding for Low-Rank Optimization
    Oh, Tae-Hyun
    Matsushita, Yasuyuki
    Tai, Yu-Wing
    Kweon, In So
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2018, 40 (02) : 376 - 391
  • [34] An algorithm for low-rank matrix factorization and its applications
    Chen, Baiyu
    Yang, Zi
    Yang, Zhouwang
    NEUROCOMPUTING, 2018, 275 : 1012 - 1020
  • [35] Low-rank and sparse matrices fitting algorithm for low-rank representation
    Zhao, Jianxi
    Zhao, Lina
    COMPUTERS & MATHEMATICS WITH APPLICATIONS, 2020, 79 (02) : 407 - 425
  • [36] Discriminative Orthonormal Dictionary Learning for Fast Low-Rank Representation
    Dong, Zhen
    Pei, Mingtao
    Jia, Yunde
    NEURAL INFORMATION PROCESSING, PT I, 2015, 9489 : 79 - 89
  • [37] Fast Low-Rank Shared Dictionary Learning for Image Classification
    Vu, Tiep Huu
    Monga, Vishal
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2017, 26 (11) : 5160 - 5175
  • [38] A DEIM Tucker tensor cross algorithm and its application to dynamical low-rank approximation
    Ghahremani, Behzad
    Babaee, Hessam
    COMPUTER METHODS IN APPLIED MECHANICS AND ENGINEERING, 2024, 423
  • [39] Low-rank isomap algorithm
    Mehrbani, Eysan
    Kahaei, Mohammad Hossein
    IET SIGNAL PROCESSING, 2022, 16 (05) : 528 - 545
  • [40] A Riemannian rank-adaptive method for low-rank matrix completion
    Gao, Bin
    Absil, P-A
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2022, 81 (01) : 67 - 90