A communication-efficient and privacy-aware distributed algorithm for sparse PCA

Cited by: 0
Authors
Lei Wang
Xin Liu
Yin Zhang
Affiliations
[1] Academy of Mathematics and Systems Science, State Key Laboratory of Scientific and Engineering Computing
[2] University of Chinese Academy of Sciences, School of Mathematical Sciences
[3] The Chinese University of Hong Kong, School of Data Science
Keywords
Alternating direction method of multipliers; Distributed computing; Optimization with orthogonality constraints; Sparse PCA
DOI
Not available
Abstract
Sparse principal component analysis (PCA) improves the interpretability of classic PCA by introducing sparsity into the dimension-reduction process. Optimization models for sparse PCA, however, are generally non-convex and non-smooth, and hence harder to solve, especially on large-scale datasets that require distributed computation over a wide network. In this paper, we develop a distributed algorithm with a centralized architecture, called DSSAL1, for sparse PCA; it aims at low communication overhead by adapting a newly proposed subspace-splitting strategy to accelerate convergence. Theoretically, convergence to stationary points is established for DSSAL1. Extensive numerical results show that DSSAL1 requires far fewer rounds of communication than state-of-the-art peer methods. In addition, we make the case that, since the messages exchanged in DSSAL1 are well-masked, the possibility of private-data leakage is much lower in DSSAL1 than in some other distributed algorithms.
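To illustrate the interpretability gain described in the abstract, the sketch below contrasts classic PCA loadings (generally dense) with sparse PCA loadings (many exact zeros). This is a generic sparse PCA example using scikit-learn's `SparsePCA`, not the paper's DSSAL1 algorithm; the toy data and the penalty weight `alpha=1.0` are illustrative choices.

```python
import numpy as np
from sklearn.decomposition import PCA, SparsePCA

# Toy data: 200 samples, 10 features; the first 3 features share one
# latent factor, the rest are low-variance noise.
rng = np.random.default_rng(0)
X = 0.1 * rng.standard_normal((200, 10))
X[:, :3] += rng.standard_normal((200, 1))

# Classic PCA: loadings are dense (no exact zeros), so every feature
# contributes to every component, which hampers interpretability.
pca = PCA(n_components=2).fit(X)
pca_zeros = int(np.sum(pca.components_ == 0))

# Sparse PCA: an L1 penalty (alpha) drives many loadings to exactly
# zero, so each component involves only a few features.
spca = SparsePCA(n_components=2, alpha=1.0, random_state=0).fit(X)
spca_zeros = int(np.sum(spca.components_ == 0))

print(pca_zeros, spca_zeros)  # sparse loadings contain exact zeros
```

In a distributed setting such as the one the paper targets, the data matrix `X` would be partitioned across nodes, and the cost of solving such a model is dominated by the rounds of communication needed to reach consensus on the loadings.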
Pages: 1033-1072
Number of pages: 39
Related papers
50 records in total
  • [1] A communication-efficient and privacy-aware distributed algorithm for sparse PCA
    Wang, Lei
    Liu, Xin
    Zhang, Yin
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2023, 85 (03) : 1033 - 1072
  • [2] Communication-Efficient and Privacy-Aware Distributed LMS Algorithm
    Gogineni, Vinay Chakravarthi
    Moradi, Ashkan
    Venkategowda, Naveen K. D.
    Werner, Stefan
    2022 25TH INTERNATIONAL CONFERENCE ON INFORMATION FUSION (FUSION 2022), 2022,
  • [3] Communication-Efficient and Privacy-Aware Distributed Learning
    Gogineni, Vinay Chakravarthi
    Moradi, Ashkan
    Venkategowda, Naveen K. D.
    Werner, Stefan
    IEEE TRANSACTIONS ON SIGNAL AND INFORMATION PROCESSING OVER NETWORKS, 2023, 9 : 705 - 720
  • [4] FAST AND COMMUNICATION-EFFICIENT DISTRIBUTED PCA
    Gang, Arpita
    Raja, Haroon
    Bajwa, Waheed U.
    2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019, : 7450 - 7454
  • [5] Communication-Efficient Distributed PCA by Riemannian Optimization
    Huang, Long-Kai
    Pan, Sinno Jialin
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119
  • [6] An Uplink Communication-Efficient Approach to Featurewise Distributed Sparse Optimization With Differential Privacy
    Lou, Jian
    Cheung, Yiu-ming
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2021, 32 (10) : 4529 - 4543
  • [7] Communication-efficient distributed covariance sketch, with application to distributed PCA
    Huang, Zengfeng
    Lin, Xuemin
    Zhang, Wenjie
    Zhang, Ying
    Journal of Machine Learning Research, 2021, 22
  • [8] More communication-efficient distributed sparse learning
    Zhou, Xingcai
    Yang, Guang
    INFORMATION SCIENCES, 2024, 668
  • [9] Communication-Efficient Distributed Covariance Sketch, with Application to Distributed PCA
    Huang, Zengfeng
    Lin, Xuemin
    Zhang, Wenjie
    Zhang, Ying
    JOURNAL OF MACHINE LEARNING RESEARCH, 2021, 22
  • [10] A Distributed Privacy-Aware Architecture for Communication in Smart Grids
    Callegari, Christian
    De Pietro, Sara
    Giordano, Stefano
    Pagano, Michele
    Procissi, Gregorio
    2013 IEEE 15TH INTERNATIONAL CONFERENCE ON HIGH PERFORMANCE COMPUTING AND COMMUNICATIONS & 2013 IEEE INTERNATIONAL CONFERENCE ON EMBEDDED AND UBIQUITOUS COMPUTING (HPCC_EUC), 2013, : 1622 - 1627