Seeking Consensus on Subspaces in Federated Principal Component Analysis

Times cited: 0
Authors
Wang, Lei [1 ]
Liu, Xin [2 ,3 ]
Zhang, Yin [4 ]
Affiliations
[1] Hong Kong Polytech Univ, Dept Appl Math, Hong Kong, Peoples R China
[2] Chinese Acad Sci, Acad Math & Syst Sci, Beijing, Peoples R China
[3] Univ Chinese Acad Sci, Beijing, Peoples R China
[4] Chinese Univ Hong Kong, Shenzhen, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Alternating direction method of multipliers; Federated learning; Principal component analysis; Orthogonality constraints; SIMULTANEOUS-ITERATION; OPTIMIZATION PROBLEMS; FRAMEWORK; ALGORITHM; SVD;
DOI
10.1007/s10957-024-02523-1
Chinese Library Classification (CLC)
C93 [Management]; O22 [Operations Research];
Discipline classification codes
070105; 12; 1201; 1202; 120202;
Abstract
In this paper, we develop an algorithm for federated principal component analysis (PCA) with emphasis on both communication efficiency and data privacy. Generally speaking, federated PCA algorithms based on direct adaptations of classic iterative methods, such as simultaneous subspace iteration, are unable to preserve data privacy, while algorithms based on variable splitting and consensus seeking, such as the alternating direction method of multipliers (ADMM), lack communication efficiency. In this work, we propose a novel consensus-seeking formulation that equalizes the subspaces spanned by the splitting variables rather than the variables themselves, thus greatly relaxing the feasibility restrictions and allowing much faster convergence. We then develop an ADMM-like algorithm with several special features that make it practically efficient, including a low-rank multiplier formula and techniques for treating the subproblems. We establish that the proposed algorithm protects data privacy better than classic methods adapted to the federated PCA setting. We derive convergence results, including a worst-case complexity estimate, for the proposed ADMM-like algorithm in the presence of nonlinear equality constraints. Extensive empirical results show that the new algorithm, while enhancing data privacy, requires far fewer rounds of communication than existing peer algorithms for federated PCA.
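As a reading aid (not taken from the paper itself), the consensus-on-subspaces idea described in the abstract can be sketched in the standard distributed-PCA notation. Here the data matrix A = [A_1, ..., A_d] is assumed to be column-partitioned across d clients, X_i denotes client i's local copy, Z a global variable, and k the target subspace dimension; these symbols are illustrative assumptions, and the precise constraints used in the paper may differ.
  % Global PCA: recover the leading k-dimensional subspace of A A^T = sum_i A_i A_i^T.
  \[
    \max_{X \in \mathbb{R}^{n \times k}} \; \sum_{i=1}^{d} \| A_i^{\top} X \|_F^2
    \quad \text{s.t.} \quad X^{\top} X = I_k .
  \]
  % Classic variable splitting forces each local copy to agree entrywise with Z:
  \[
    X_i = Z, \qquad i = 1, \dots, d .
  \]
  % Subspace consensus only requires the spanned subspaces to agree; for orthonormal
  % X_i and Z this can be expressed through equality of the orthogonal projections:
  \[
    X_i X_i^{\top} = Z Z^{\top}, \qquad X_i^{\top} X_i = I_k, \qquad Z^{\top} Z = I_k ,
  \]
  % a nonlinear equality constraint that is far less restrictive than X_i = Z.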
Pages: 529-561
Page count: 33