Fundamental limits and algorithms for sparse linear regression with sublinear sparsity

Cited by: 0
Author
Truong, Lan V. [1 ]
Affiliation
[1] Univ Cambridge, Dept Engn, Cambridge CB2 1PZ, England
Keywords
Bayesian Inference; Approximate Message Passing; Replica Method; Interpolation Method; Support Recovery; Information; CDMA
DOI
Not available
CLC Classification Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
We establish exact asymptotic expressions for the normalized mutual information and minimum mean-square error (MMSE) of sparse linear regression in the sub-linear sparsity regime. Our result is achieved by generalizing the adaptive interpolation method in Bayesian inference from linear regimes to sub-linear ones. We also propose a modification of the well-known approximate message passing (AMP) algorithm that approaches the MMSE fundamental limit, and we rigorously analyse its state evolution. Our results show that the traditional linear scaling between the signal dimension and the number of observations assumed in the replica and adaptive interpolation methods is not necessary for sparse signals. They also show how to adapt existing AMP algorithms from the linear regime to the sub-linear one.
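For orientation, the sketch below illustrates the standard linear-regime AMP recursion and its scalar state-evolution recursion that the abstract builds on; it is not the paper's sub-linear-sparsity modification. It assumes a Bernoulli-Gaussian prior with fixed sparsity level eps and a sensing matrix with i.i.d. N(0, 1/m) entries; all function names and parameter values are illustrative assumptions.

```python
# Minimal sketch (not the paper's algorithm): standard AMP for y = A x + w with
# A_ij ~ N(0, 1/m), Bernoulli-Gaussian prior X = B*G, B ~ Bern(eps), G ~ N(0,1),
# plus the matching scalar state-evolution recursion. Illustrative only.
import numpy as np


def gauss_pdf(x, var):
    """Density of N(0, var) evaluated at x."""
    return np.exp(-x ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)


def denoise(r, tau2, eps):
    """Posterior-mean denoiser for X given R = X + sqrt(tau2) * Z."""
    p1 = eps * gauss_pdf(r, 1.0 + tau2)          # likelihood when the entry is active
    p0 = (1.0 - eps) * gauss_pdf(r, tau2)        # likelihood when the entry is zero
    post = p1 / (p1 + p0)                        # posterior probability of being active
    return post * r / (1.0 + tau2)


def denoise_deriv(r, tau2, eps, h=1e-5):
    """Numerical derivative of the denoiser; sufficient for the Onsager term in a sketch."""
    return (denoise(r + h, tau2, eps) - denoise(r - h, tau2, eps)) / (2.0 * h)


def amp(y, A, eps, n_iter=30):
    """Standard AMP iteration; the effective noise variance tau2 is estimated from the residual."""
    m, n = A.shape
    delta = m / n
    x = np.zeros(n)
    z = y.copy()
    tau2 = np.mean(z ** 2)
    for _ in range(n_iter):
        r = x + A.T @ z                                        # effective scalar channel r ~ x + tau*Z
        x = denoise(r, tau2, eps)
        onsager = z * np.mean(denoise_deriv(r, tau2, eps)) / delta
        z = y - A @ x + onsager                                # residual with Onsager correction
        tau2 = np.mean(z ** 2)                                 # empirical effective-noise variance
    return x


def state_evolution(eps, sigma2, delta, n_iter=30, n_mc=200_000, seed=0):
    """Scalar recursion tau_{t+1}^2 = sigma^2 + E[(eta(X + tau_t Z) - X)^2] / delta."""
    rng = np.random.default_rng(seed)
    x = (rng.random(n_mc) < eps) * rng.standard_normal(n_mc)   # Bernoulli-Gaussian samples
    z = rng.standard_normal(n_mc)
    tau2 = sigma2 + eps / delta                                # since E[X^2] = eps
    mse_track = []
    for _ in range(n_iter):
        mse = np.mean((denoise(x + np.sqrt(tau2) * z, tau2, eps) - x) ** 2)
        mse_track.append(mse)
        tau2 = sigma2 + mse / delta
    return mse_track


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, m, eps, sigma2 = 2000, 1000, 0.05, 0.01
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    x_true = (rng.random(n) < eps) * rng.standard_normal(n)
    y = A @ x_true + np.sqrt(sigma2) * rng.standard_normal(m)
    x_hat = amp(y, A, eps)
    print("empirical MSE :", np.mean((x_hat - x_true) ** 2))
    print("SE prediction :", state_evolution(eps, sigma2, m / n)[-1])
```

In the paper's sub-linear regime the sparsity level is not a fixed constant but vanishes with the signal dimension, so this fixed-eps sketch only serves as the linear-regime baseline that the abstract says is being modified.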
Pages: 49
Related papers (50 records in total)
  • [31] Linear Algorithms in Sublinear Time-a Tutorial on Statistical Estimation
    Ullrich, Torsten
    Fellner, Dieter W.
    IEEE COMPUTER GRAPHICS AND APPLICATIONS, 2011, 31 (02) : 58 - 66
  • [32] ℓq Sparsity Penalized Linear Regression With Cyclic Descent
    Marjanovic, Goran
    Solo, Victor
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2014, 62 (06) : 1464 - 1475
  • [33] BAYESIAN LINEAR REGRESSION WITH SPARSE PRIORS
    Castillo, Ismael
    Schmidt-Hieber, Johannes
    Van der Vaart, Aad
    ANNALS OF STATISTICS, 2015, 43 (05): : 1986 - 2018
  • [34] Linear regression with a sparse parameter vector
    Larsson, Erik G.
    Selen, Yngve
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2007, 55 (02) : 451 - 460
  • [35] Feature Adaptation for Sparse Linear Regression
    Kelner, Jonathan A.
    Koehler, Frederic
    Meka, Raghu
    Rohatgi, Dhruv
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [36] On the Power of Preconditioning in Sparse Linear Regression
    Kelner, Jonathan A.
    Koehler, Frederic
    Meka, Raghu
    Rohatgi, Dhruv
    2021 IEEE 62ND ANNUAL SYMPOSIUM ON FOUNDATIONS OF COMPUTER SCIENCE (FOCS 2021), 2022, : 550 - 561
  • [37] Sparse estimation in functional linear regression
    Lee, Eun Ryung
    Park, Byeong U.
    JOURNAL OF MULTIVARIATE ANALYSIS, 2012, 105 (01) : 1 - 17
  • [38] Linear regression with a sparse parameter vector
    Larsson, Erik G.
    Selen, Yngve
    2006 IEEE International Conference on Acoustics, Speech and Signal Processing, Vols 1-13, 2006, : 2760 - 2763
  • [39] Algorithms for Sparse k-Monotone Regression
    Sidorov, Sergei P.
    Faizliev, Alexey R.
    Gudkov, Alexander A.
    Mironov, Sergei V.
    INTEGRATION OF CONSTRAINT PROGRAMMING, ARTIFICIAL INTELLIGENCE, AND OPERATIONS RESEARCH, CPAIOR 2018, 2018, 10848 : 546 - 556
  • [40] Fundamental limits for cooling of linear quantum refrigerators
    Freitas, Nahuel
    Paz, Juan Pablo
    PHYSICAL REVIEW E, 2017, 95 (01)