Fast optimization algorithm on Riemannian manifolds and its application in low-rank learning

Cited by: 5
Authors
Chen, Haoran [1 ]
Sun, Yanfeng [1 ]
Gao, Junbin [2 ]
Hu, Yongli [1 ]
Yin, Baocai [3 ]
Affiliations
[1] Beijing Univ Technol, Fac Informat Technol, Beijing Key Lab Multimedia & Intelligent Software, Beijing 100124, Peoples R China
[2] Univ Sydney, Business Sch, Discipline Business Analyt, Sydney, NSW 2006, Australia
[3] Dalian Univ Technol, Fac Elect Informat & Elect Engn, Dalian 116024, Peoples R China
Funding
National Natural Science Foundation of China; Beijing Natural Science Foundation;
Keywords
Fast optimization algorithm; Riemannian manifolds; Low-rank matrix variety; Low-rank representation; Subspace pursuit; Augmented Lagrange method; Clustering;
DOI
10.1016/j.neucom.2018.02.058
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
The paper proposes a first-order fast optimization algorithm on Riemannian manifolds (FOA) to speed up optimization for a class of composite functions on Riemannian manifolds. The theoretical analysis shows that FOA achieves the optimal convergence rate for the sequence of function values. Experiments on the matrix completion task show that FOA outperforms existing first-order optimization methods on Riemannian manifolds. A subspace pursuit method (SP-RPRG(ALM)) based on FOA is also proposed to solve the low-rank representation model with the augmented Lagrange method (ALM) on the low-rank matrix variety. Experimental results on synthetic data and public databases demonstrate that both FOA and SP-RPRG(ALM) converge faster and achieve higher accuracy than competing methods. We have made the experimental code public at https://github.com/Haoran2014. (c) 2018 Elsevier B.V. All rights reserved.
Pages: 59-70
Number of pages: 12
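
The abstract describes FOA only at a high level. As a rough illustration of the ingredients that accelerated first-order Riemannian methods generally share (a retraction, a tangent-space projection of the gradient, and a Nesterov-style momentum combination), the following minimal Python sketch minimizes the Rayleigh quotient f(x) = -x^T A x on the unit sphere. This is not the paper's FOA, its composite-function setting, or its low-rank matrix variety; the function names sphere_retract, riem_grad, and accelerated_riemannian_gd, the step-size rule, and the test problem are all illustrative assumptions.

import numpy as np

def sphere_retract(x, v):
    # Retraction on the unit sphere: step in direction v, then renormalize.
    y = x + v
    return y / np.linalg.norm(y)

def riem_grad(x, A):
    # Riemannian gradient of f(x) = -x^T A x on the sphere:
    # project the Euclidean gradient onto the tangent space at x.
    egrad = -2.0 * A @ x
    return egrad - (x @ egrad) * x

def accelerated_riemannian_gd(A, x0, iters=300):
    # Nesterov-style momentum combined with a retraction; step = 1/L,
    # with L = 2 * ||A||_2 the Lipschitz constant of the gradient of f.
    step = 0.5 / np.linalg.norm(A, 2)
    x_prev, x = x0.copy(), x0.copy()
    for k in range(1, iters + 1):
        beta = (k - 1.0) / (k + 2.0)
        y = sphere_retract(x, beta * (x - x_prev))       # momentum point
        x_prev, x = x, sphere_retract(y, -step * riem_grad(y, A))
    return x

rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T                                  # symmetric PSD test matrix
x0 = rng.standard_normal(50)
x0 /= np.linalg.norm(x0)
x_hat = accelerated_riemannian_gd(A, x0)
w, V = np.linalg.eigh(A)
print(abs(V[:, -1] @ x_hat))                 # ~1.0 once aligned with the top eigenvector

On this toy problem the iterate aligns with the leading eigenvector of A; the paper's FOA additionally handles the nonsmooth term of the composite objective and operates on the low-rank matrix variety rather than the sphere.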