Fast optimization algorithm on Riemannian manifolds and its application in low-rank learning

Cited by: 5
Authors
Chen, Haoran [1 ]
Sun, Yanfeng [1 ]
Gao, Junbin [2 ]
Hu, Yongli [1 ]
Yin, Baocai [3 ]
Affiliations
[1] Beijing Univ Technol, Fac Informat Technol, Beijing Key Lab Multimedia & Intelligent Software, Beijing 100124, Peoples R China
[2] Univ Sydney, Business Sch, Discipline Business Analyt, Sydney, NSW 2006, Australia
[3] Dalian Univ Technol, Fac Elect Informat & Elect Engn, Dalian 116024, Peoples R China
Funding
National Natural Science Foundation of China; Beijing Natural Science Foundation
Keywords
Fast optimization algorithm; Riemannian manifolds; Low-rank matrix variety; Low-rank representation; Subspace pursuit; Augmented Lagrange method; Clustering;
DOI
10.1016/j.neucom.2018.02.058
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
The paper proposes a first-order fast optimization algorithm on Riemannian manifolds (FOA) to address the problem of speeding up optimization algorithms for a class of composite functions on Riemannian manifolds. The theoretical analysis for FOA shows that the algorithm achieves the optimal rate of convergence for the sequence of function values. Experiments on the matrix completion task show that FOA outperforms other existing first-order optimization methods on Riemannian manifolds. A subspace pursuit method (SP-RPRG(ALM)) based on FOA is also proposed to solve the low-rank representation model with the augmented Lagrange method (ALM) on the low-rank matrix variety. Experimental results on synthetic data and public databases demonstrate that both FOA and SP-RPRG(ALM) achieve superior performance in terms of faster convergence and higher accuracy. We have made the experimental code public at https://github.com/Haoran2014. (c) 2018 Elsevier B.V. All rights reserved.
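The abstract describes a Nesterov-style accelerated first-order method combined with a low-rank matrix constraint, as used in matrix completion. The sketch below illustrates that general idea, not the paper's exact FOA: accelerated gradient steps on the data-fit term, with truncated SVD as the retraction onto the variety of rank-at-most-r matrices. All function names and the step-size choice are this sketch's own assumptions.

```python
import numpy as np

def rank_r_truncation(X, r):
    # Retraction onto {X : rank(X) <= r} via truncated SVD
    # (a common choice; the paper's exact retraction may differ).
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def accelerated_completion(M, mask, r, step=1.0, iters=200):
    """Nesterov-style accelerated gradient with a low-rank retraction.

    M    : matrix with observed entries (values outside `mask` ignored)
    mask : boolean array marking observed entries
    r    : target rank
    """
    X = np.zeros_like(M)
    Y = X.copy()          # extrapolated point
    t = 1.0               # momentum parameter
    for _ in range(iters):
        grad = mask * (Y - M)                 # gradient of 0.5*||P_Omega(Y - M)||^2
        X_new = rank_r_truncation(Y - step * grad, r)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        Y = X_new + ((t - 1.0) / t_new) * (X_new - X)  # momentum extrapolation
        X, t = X_new, t_new
    return X
```

On a small synthetic rank-2 completion problem with most entries observed, this recovers the observed entries to high accuracy; the acceleration (the `t`/`t_new` update) is what gives the O(1/k^2) rate for the function values in the convex Euclidean setting that FOA generalizes to manifolds.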
Pages: 59-70
Page count: 12
Related Papers
50 records total
  • [21] Fast Recursive Low-rank Tensor Learning for Regression
    Hou, Ming
    Chaib-draa, Brahim
    PROCEEDINGS OF THE TWENTY-SIXTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 1851 - 1857
  • [22] Accurate and fast matrix factorization for low-rank learning
    Godaz, Reza
    Monsefi, Reza
    Toutounian, Faezeh
    Hosseini, Reshad
    JOURNAL OF MATHEMATICAL MODELING, 2022, 10 (02): : 263 - 278
  • [23] Fast Low-Rank Matrix Learning with Nonconvex Regularization
    Yao, Quanming
    Kwok, James T.
    Zhong, Wenliang
    2015 IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2015, : 539 - 548
  • [24] Learning Fast Low-Rank Projection for Image Classification
    Li, Jun
    Kong, Yu
    Zhao, Handong
    Yang, Jian
    Fu, Yun
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2016, 25 (10) : 4803 - 4814
  • [25] A normal fan projection algorithm for low-rank optimization
    Scott, James R.
    Geunes, Joseph
    MATHEMATICAL PROGRAMMING, 2025, 209 (1-2) : 681 - 702
  • [26] Low-Rank Optimization Algorithm for Accelerated Dynamic MRI
    Yang Min
    PROCEEDINGS OF THE 28TH CHINESE CONTROL AND DECISION CONFERENCE (2016 CCDC), 2016, : 1956 - 1960
  • [27] Riemannian SVRG: Fast Stochastic Optimization on Riemannian Manifolds
    Zhang, Hongyi
    Reddi, Sashank J.
    Sra, Suvrit
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 29 (NIPS 2016), 2016, 29
  • [28] SOLVING PHASELIFT BY LOW-RANK RIEMANNIAN OPTIMIZATION METHODS FOR COMPLEX SEMIDEFINITE CONSTRAINTS
    Huang, Wen
    Gallivan, K. A.
    Zhang, Xiangxiong
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2017, 39 (05): : B840 - B859
  • [29] PRECONDITIONED LOW-RANK RIEMANNIAN OPTIMIZATION FOR LINEAR SYSTEMS WITH TENSOR PRODUCT STRUCTURE
    Kressner, Daniel
    Steinlechner, Michael
    Vandereycken, Bart
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2016, 38 (04): : A2018 - A2044
  • [30] Tensor Completion using Low-Rank Tensor Train Decomposition by Riemannian Optimization
    Wang, Junli
    Zhao, Guangshe
    Wang, Dingheng
    Li, Guoqi
    2019 CHINESE AUTOMATION CONGRESS (CAC2019), 2019, : 3380 - 3384