Seeking Consensus on Subspaces in Federated Principal Component Analysis

Cited: 0
Authors
Wang, Lei [1 ]
Liu, Xin [2 ,3 ]
Zhang, Yin [4 ]
Affiliations
[1] Hong Kong Polytech Univ, Dept Appl Math, Hong Kong, Peoples R China
[2] Chinese Acad Sci, Acad Math & Syst Sci, Beijing, Peoples R China
[3] Univ Chinese Acad Sci, Beijing, Peoples R China
[4] Chinese Univ Hong Kong, Shenzhen, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Alternating direction method of multipliers; Federated learning; Principal component analysis; Orthogonality constraints; SIMULTANEOUS-ITERATION; OPTIMIZATION PROBLEMS; FRAMEWORK; ALGORITHM; SVD;
DOI
10.1007/s10957-024-02523-1
Chinese Library Classification (CLC)
C93 [Management]; O22 [Operations Research];
Discipline Codes
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract
In this paper, we develop an algorithm for federated principal component analysis (PCA) with emphasis on both communication efficiency and data privacy. Generally speaking, federated PCA algorithms based on direct adaptations of classic iterative methods, such as simultaneous subspace iteration, are unable to preserve data privacy, while algorithms based on variable splitting and consensus seeking, such as the alternating direction method of multipliers (ADMM), lack communication efficiency. In this work, we propose a novel consensus-seeking formulation that equalizes the subspaces spanned by the splitting variables instead of the variables themselves, thus greatly relaxing feasibility restrictions and allowing much faster convergence. We then develop an ADMM-like algorithm with several special features that make it practically efficient, including a low-rank multiplier formula and techniques for treating subproblems. We establish that the proposed algorithm can better protect data privacy than classic methods adapted to the federated PCA setting. We derive convergence results, including a worst-case complexity estimate, for the proposed ADMM-like algorithm in the presence of nonlinear equality constraints. Extensive empirical results show that the new algorithm, while enhancing data privacy, requires far fewer communication rounds than existing peer algorithms for federated PCA.
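To make the subspace-consensus idea concrete: instead of the standard consensus constraint X_i = Z on the splitting variables, the formulation described in the abstract equalizes spans, which for orthonormal bases amounts to X_i X_i^T = Z Z^T. The Python sketch below illustrates this idea with a simplified penalty-and-averaging scheme; it is not the authors' ADMM-like algorithm (the multiplier updates and the low-rank multiplier formula are omitted), and the function name, penalty weight, and server aggregation rule are our own illustrative assumptions.

```python
import numpy as np

def orth(M):
    """Orthonormalize the columns of M via thin QR."""
    Q, _ = np.linalg.qr(M)
    return Q

def federated_pca_subspace_consensus(blocks, k, beta=1.0, rounds=50, seed=0):
    """Illustrative sketch (not the paper's algorithm): each client i keeps
    its data A_i private and maintains a local orthonormal basis X_i; the
    server maintains a global basis Z. Consensus is sought on the spans
    span(X_i) = span(Z), i.e. on the projectors X_i X_i^T, rather than on
    the variables X_i themselves."""
    rng = np.random.default_rng(seed)
    d = blocks[0].shape[0]
    Z = orth(rng.standard_normal((d, k)))    # global basis (d x k)
    Xs = [Z.copy() for _ in blocks]          # local splitting variables

    for _ in range(rounds):
        # Client step: one simultaneous-subspace-iteration step on the
        # local covariance A_i A_i^T, plus a penalty term beta * Z Z^T X_i
        # pulling span(X_i) toward span(Z); QR retracts the iterate back
        # to orthonormal columns. Only the d x k basis leaves the client.
        for i, A in enumerate(blocks):
            Xs[i] = orth(A @ (A.T @ Xs[i]) + beta * (Z @ (Z.T @ Xs[i])))
        # Server step: average the local projectors X_i X_i^T and take the
        # dominant k-dimensional eigenspace as the new global basis.
        P = sum(X @ X.T for X in Xs) / len(Xs)
        w, V = np.linalg.eigh(P)             # ascending eigenvalues
        Z = V[:, np.argsort(w)[::-1][:k]]
    return Z

# Usage: three clients hold disjoint column blocks of a rank-5 data matrix;
# the recovered basis is compared with the top-2 left singular subspace.
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 5)) @ rng.standard_normal((5, 300))
blocks = np.split(A, 3, axis=1)
Z = federated_pca_subspace_consensus(blocks, k=2)
U = np.linalg.svd(A)[0][:, :2]
print("projector distance:", np.linalg.norm(U @ U.T - Z @ Z.T))
```

Because only the spans must agree, each client's X_i is free to converge to any orthonormal basis of the common subspace; this is the looser feasibility requirement that the abstract credits for faster convergence than elementwise consensus X_i = Z.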
Pages: 529-561
Page count: 33
Related Papers
50 records in total
  • [21] Principal Component Projection Without Principal Component Analysis
    Frostig, Roy
    Musco, Cameron
    Musco, Christopher
    Sidford, Aaron
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 48, 2016
  • [22] Accelerating Wireless Federated Learning via Nesterov's Momentum and Distributed Principal Component Analysis
    Dong, Yanjie
    Wang, Luya
    Wang, Jia
    Hu, Xiping
    Zhang, Haijun
    Yu, Fei Richard
    Leung, Victor C. M.
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2024, 23 (06) : 5938 - 5952
  • [23] Principal component analysis
    Michael Greenacre
    Patrick J. F. Groenen
    Trevor Hastie
    Alfonso Iodice D’Enza
    Angelos Markos
    Elena Tuzhilina
    NATURE REVIEWS METHODS PRIMERS, 2022, 2
  • [24] Principal component analysis
    Wallen, Hayley
    NATURE REVIEWS METHODS PRIMERS, 2022, 2 (1)
  • [25] Principal component analysis
    Bro, Rasmus
    Smilde, Age K.
    ANALYTICAL METHODS, 2014, 6 (09) : 2812 - 2831
  • [26] Principal component analysis
    Jake Lever
    Martin Krzywinski
    Naomi Altman
    Nature Methods, 2017, 14 : 641 - 642
  • [27] Principal component analysis
    Abdi, Herve
    Williams, Lynne J.
    WILEY INTERDISCIPLINARY REVIEWS-COMPUTATIONAL STATISTICS, 2010, 2 (04) : 433 - 459
  • [29] PRINCIPAL COMPONENT ANALYSIS
    WOLD, S
    ESBENSEN, K
    GELADI, P
    CHEMOMETRICS AND INTELLIGENT LABORATORY SYSTEMS, 1987, 2 (1-3) : 37 - 52
  • [30] Principal component analysis
    Hess, Aaron S.
    Hess, John R.
    TRANSFUSION, 2018, 58 (07) : 1580 - 1582