Distributed sparsity constrained optimization over the Stiefel manifold

Times cited: 0
Authors
Qu, Wentao [1 ]
Chen, Huangyue [2 ,3 ]
Xiu, Xianchao [4 ]
Liu, Wanquan [5 ]
Affiliations
[1] Beijing Jiaotong Univ, Sch Math & Stat, Beijing 100044, Peoples R China
[2] Guangxi Univ, Sch Math & Informat Sci, Nanning 530004, Peoples R China
[3] Chinese Acad Sci, Inst Appl Math, Acad Math & Syst Sci, Beijing 100190, Peoples R China
[4] Shanghai Univ, Sch Mechatron Engn & Automat, Shanghai 200444, Peoples R China
[5] Sun Yat Sen Univ, Sch Intelligent Syst Engn, Shenzhen 518107, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Distributed optimization; Sparsity constrained optimization; Stiefel manifold; Newton method; Consensus;
DOI
10.1016/j.neucom.2024.128267
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
Distributed optimization aims to complete specified tasks effectively through cooperation among multi-agent systems, and it has achieved great success in large-scale optimization problems. However, developing an effective distributed algorithm with theoretical guarantees remains challenging, especially in the presence of nonconvex constraints. Moreover, high-dimensional data often exhibit inherent structure such as sparsity which, when exploited accurately, can significantly improve how well their intrinsic characteristics are captured. In this paper, we introduce a novel distributed sparsity constrained optimization framework over the Stiefel manifold, abbreviated as DREAM. DREAM integrates the ℓ2,0-norm constraint and the Stiefel manifold constraint within a distributed optimization setting, a combination that has not been investigated in the existing literature. Unlike existing distributed methods, the proposed DREAM can not only extract similarity information among samples, but also determine the number of features to be extracted more flexibly. We then develop an efficient Newton augmented Lagrangian-based algorithm. On the theoretical side, we characterize the relationships among minimizers, Karush-Kuhn-Tucker points, and stationary points, and rigorously show that the sequence generated by our algorithm converges to a stationary point. Extensive numerical experiments verify its superiority over state-of-the-art distributed methods.
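For orientation, the problem class described in the abstract can be sketched in a generic form; this is an illustrative formulation under standard conventions (a sum of local agent objectives handled via consensus), not necessarily the exact DREAM model, whose precise statement is given in the paper. Writing St(p, r) = {X ∈ R^(p×r) : XᵀX = I_r} for the Stiefel manifold and ‖X‖_{2,0} for the number of nonzero rows of X, a representative instance over d agents with local objectives f_i is

    \min_{X \in \mathbb{R}^{p \times r}} \sum_{i=1}^{d} f_i(X)
    \quad \text{s.t.} \quad X \in \mathrm{St}(p, r), \;\; \|X\|_{2,0} \le s,

where s caps the number of extracted features (nonzero rows); in a distributed implementation each agent holds a local copy of X and consensus constraints drive the copies to agreement.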
Pages: 14
Related papers
50 records in total
  • [1] A constraint dissolving approach for nonsmooth optimization over the Stiefel manifold
    Hu, Xiaoyin
    Xiao, Nachuan
    Liu, Xin
    Toh, Kim-Chuan
    IMA JOURNAL OF NUMERICAL ANALYSIS, 2023, 44 (06) : 3717 - 3748
  • [2] Multipliers Correction Methods for Optimization Problems over the Stiefel Manifold
    Wang, Lei
    Gao, Bin
    Liu, Xin
    CSIAM TRANSACTIONS ON APPLIED MATHEMATICS, 2021, 2 (03): : 508 - 531
  • [3] Decentralized Non-Smooth Optimization Over the Stiefel Manifold
    Wang, Jinxin
    Hu, Jiang
    Chen, Shixiang
    Deng, Zengde
    So, Anthony Man-Cho
2024 IEEE 13TH SENSOR ARRAY AND MULTICHANNEL SIGNAL PROCESSING WORKSHOP, SAM 2024, 2024,
  • [4] A Strengthened SDP Relaxation for Quadratic Optimization Over the Stiefel Manifold
    Burer, Samuel
    Park, Kyungchan
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2024, 202 (01) : 320 - 339
  • [5] PROXIMAL GRADIENT METHOD FOR NONSMOOTH OPTIMIZATION OVER THE STIEFEL MANIFOLD
    Chen, Shixiang
    Ma, Shiqian
    So, Anthony Man-Cho
    Zhang, Tong
    SIAM JOURNAL ON OPTIMIZATION, 2020, 30 (01) : 210 - 239
  • [6] Interpretable domain adaptation via optimization over the Stiefel manifold
    Poelitz, Christian
    Duivesteijn, Wouter
    Morik, Katharina
    MACHINE LEARNING, 2016, 104 (2-3) : 315 - 336
  • [7] Two adaptive scaled gradient projection methods for Stiefel manifold constrained optimization
    Oviedo, Harry
    Dalmau, Oscar
    Lara, Hugo
    NUMERICAL ALGORITHMS, 2021, 87 (03) : 1107 - 1127
  • [8] Decentralized Optimization Over the Stiefel Manifold by an Approximate Augmented Lagrangian Function
    Wang, Lei
    Liu, Xin
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2022, 70 : 3029 - 3041