Communication-Efficient Decentralized Sparse Bayesian Learning of Joint Sparse Signals

Cited by: 16
Authors
Khanna, Saurabh [1 ]
Murthy, Chandra R. [1 ]
Affiliations
[1] Indian Inst Sci, Dept Elect Commun Engn, Bangalore 560012, Karnataka, India
Keywords
Compressed sensing; distributed estimation; joint sparsity; sensor networks; sparse Bayesian learning; SENSOR NETWORKS; BANDWIDTH;
DOI
10.1109/TSIPN.2016.2632041
CLC Classification Number
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Subject Classification Number
0808 ; 0809 ;
Abstract
We consider the problem of decentralized estimation of multiple joint sparse vectors by a network of nodes from locally acquired noisy and underdetermined linear measurements, when the cost of communication between the nodes is at a premium. We propose an iterative, decentralized Bayesian algorithm called fusion-based distributed sparse Bayesian learning (FB-DSBL), in which the nodes collaborate by exchanging highly compressed messages to learn a common joint-sparsity-inducing signal prior. The learnt signal prior is subsequently used by each node to compute the maximum a posteriori probability (MAP) estimate of its respective sparse vector. Since internode communication is expensive, the size of the messages exchanged between nodes is reduced substantially by exchanging only those local signal prior parameters that are associated with the nonzero support detected via multiple composite log-likelihood ratio tests. The average message size is empirically shown to be proportional to the information rate of the unknown vectors. The proposed sparse Bayesian learning (SBL)-based distributed algorithm allows nodes to exploit the underlying joint sparsity of the signals. In turn, this enables the nodes to recover sparse vectors with a significantly smaller number of measurements compared to the standalone SBL algorithm. The proposed algorithm is interpreted as a degenerate case of a distributed consensus-based stochastic approximation algorithm for finding a fixed point of a function, and its generalized version with Robbins-Monro-type iterations is also developed. Using Monte Carlo simulations, we demonstrate that the proposed FB-DSBL has superior mean squared error and support recovery performance compared to existing decentralized algorithms with similar or higher communication complexity.
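The core ingredients described in the abstract — a Gaussian SBL prior with per-coefficient variance hyperparameters, local EM updates at each node, and fusion of those updates across the network so that all nodes learn a common joint-sparsity-inducing prior — can be illustrated with a minimal sketch. This is not the paper's FB-DSBL algorithm (which compresses messages and detects the support via composite log-likelihood ratio tests); here the nodes simply average their local hyperparameter updates exactly, and the problem sizes, variable names, and iteration count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# L nodes each observe y_l = A_l @ x_l + w_l, where the vectors x_l
# share a common sparse support (joint sparsity).
L, n, m, k = 4, 40, 15, 3                  # nodes, signal dim, measurements, sparsity
sigma2 = 1e-4                              # noise variance
support = rng.choice(n, size=k, replace=False)
A = [rng.standard_normal((m, n)) / np.sqrt(m) for _ in range(L)]
X = np.zeros((L, n))
X[:, support] = rng.standard_normal((L, k))
Y = [A[l] @ X[l] + np.sqrt(sigma2) * rng.standard_normal(m) for l in range(L)]

# EM-based SBL with a common hyperparameter vector gamma (prior variances).
# Each node runs a local E-step/M-step; the network then "fuses" the local
# updates by exact averaging -- a stand-in for the paper's message exchange.
gamma = np.ones(n)
for _ in range(200):
    local_updates = np.zeros((L, n))
    for l in range(L):
        # E-step: posterior of x_l under the prior N(0, diag(gamma)),
        # in Woodbury form so only an m x m matrix is inverted.
        AG = A[l] * gamma                  # = A_l @ diag(gamma)
        S = np.linalg.inv(sigma2 * np.eye(m) + AG @ A[l].T)
        mu = AG.T @ S @ Y[l]               # posterior mean
        Sigma_diag = gamma - np.einsum('ji,jk,ki->i', AG, S, AG)
        # M-step: local update of the prior variances
        local_updates[l] = mu**2 + Sigma_diag
    gamma = local_updates.mean(axis=0)     # fusion by averaging

# Coefficients off the common support have their variances driven toward
# zero, so the largest entries of gamma reveal the shared support.
est_support = np.sort(np.argsort(gamma)[-k:])
print(est_support, np.sort(support))       # estimated vs. true support
```

Because gamma is shared, measurements from all nodes jointly shape the prior, which is what lets each node succeed with fewer measurements than standalone SBL; the paper's contribution is doing this fusion with far fewer exchanged bits than the dense averaging shown here.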
Pages: 617 - 630
Page count: 14
Related Papers
50 records in total
  • [31] Communication-Efficient Network Topology in Decentralized Learning: A Joint Design of Consensus Matrix and Resource Allocation
    Wang, Jingrong
    Liang, Ben
    Zhu, Zhongwen
    Fapi, Emmanuel Thepie
    Dalal, Hardik
    IEEE-ACM TRANSACTIONS ON NETWORKING, 2024
  • [32] Scaling Betweenness Centrality using Communication-Efficient Sparse Matrix Multiplication
    Solomonik, Edgar
    Besta, Maciej
    Vella, Flavio
    Hoefler, Torsten
    SC'17: PROCEEDINGS OF THE INTERNATIONAL CONFERENCE FOR HIGH PERFORMANCE COMPUTING, NETWORKING, STORAGE AND ANALYSIS, 2017,
  • [33] Distributed and communication-efficient solutions to linear equations with special sparse structure
    Wang, Peng
    Gao, Yuanqi
    Yu, Nanpeng
    Ren, Wei
    Lian, Jianming
    Wu, Di
    SYSTEMS & CONTROL LETTERS, 2022, 160
  • [34] Pattern-Coupled Sparse Bayesian Learning for Recovery of Block-Sparse Signals
    Fang, Jun
    Shen, Yanning
    Li, Hongbin
    Wang, Pu
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2015, 63 (02) : 360 - 372
  • [35] Sparse Bayesian Learning With Dynamic Filtering for Inference of Time-Varying Sparse Signals
    O'Shaughnessy, Matthew R.
    Davenport, Mark A.
    Rozell, Christopher J.
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2020, 68 : 388 - 403
  • [36] Communication-efficient ADMM-based distributed algorithms for sparse training
    Wang, Guozheng
    Lei, Yongmei
    Qiu, Yongwen
    Lou, Lingfei
    Li, Yixin
    NEUROCOMPUTING, 2023, 550
  • [37] Joint Sparse Bayesian Learning for Channel Estimation in ISAC
    Chen, Kangjian
    Qi, Chenhao
    IEEE COMMUNICATIONS LETTERS, 2024, 28 (08) : 1825 - 1829
  • [38] Communication-Efficient Learning of Deep Networks from Decentralized Data
    McMahan, H. Brendan
    Moore, Eider
    Ramage, Daniel
    Hampson, Seth
    Aguera y Arcas, Blaise
    ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 54, 2017, 54 : 1273 - 1282
  • [39] Communication-Efficient Decentralized Learning with Sparsification and Adaptive Peer Selection
    Tang, Zhenheng
    Shi, Shaohuai
    Chu, Xiaowen
    2020 IEEE 40TH INTERNATIONAL CONFERENCE ON DISTRIBUTED COMPUTING SYSTEMS (ICDCS), 2020, : 1207 - 1208
  • [40] Communication-efficient Decentralized Machine Learning over Heterogeneous Networks
    Zhou, Pan
    Lin, Qian
    Loghin, Dumitrel
    Ooi, Beng Chin
    Wu, Yuncheng
    Yu, Hongfang
    2021 IEEE 37TH INTERNATIONAL CONFERENCE ON DATA ENGINEERING (ICDE 2021), 2021, : 384 - 395