Fundamental limits and algorithms for sparse linear regression with sublinear sparsity

Cited: 0
Authors
Truong, Lan V. [1 ]
Affiliation
[1] Univ Cambridge, Dept Engn, Cambridge CB2 1PZ, England
Keywords
Bayesian Inference; Approximate Message Passing; Replica Method; Interpolation Method; SUPPORT RECOVERY; INFORMATION; CDMA
DOI
None available
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology]
Discipline Code
0812
Abstract
We establish exact asymptotic expressions for the normalized mutual information and minimum mean-square error (MMSE) of sparse linear regression in the sub-linear sparsity regime. Our result is achieved by generalizing the adaptive interpolation method of Bayesian inference from linear regimes to sub-linear ones. We also propose a modification of the well-known approximate message passing (AMP) algorithm that approaches the MMSE fundamental limit, and we rigorously analyse its state evolution. Our results show that the traditional linear scaling assumption between the signal dimension and the number of observations in the replica and adaptive interpolation methods is not necessary for sparse signals. They also show how to modify existing AMP algorithms designed for linear regimes so that they apply to sub-linear ones.
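To make the abstract concrete, the following is a minimal sketch of a *generic* AMP iteration for the linear model y = A x + noise, using a soft-thresholding denoiser with an empirical threshold rule. This is an illustrative textbook variant, not the paper's modified sub-linear-regime algorithm; the function names, the threshold scale `alpha`, and the choice of denoiser are all assumptions made for the example.

```python
import numpy as np

def soft_threshold(v, tau):
    # Elementwise soft-thresholding denoiser, a standard choice for sparse priors.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def amp(y, A, n_iter=50, alpha=1.5):
    """Generic AMP sketch for y = A x + noise with i.i.d. Gaussian A
    (columns of roughly unit norm). `alpha` scales the threshold and is
    an illustrative choice, not a parameter from the paper."""
    m, n = A.shape
    x = np.zeros(n)
    z = y.copy()
    for _ in range(n_iter):
        tau = alpha * np.sqrt(np.mean(z ** 2))   # empirical noise-level estimate
        x_new = soft_threshold(x + A.T @ z, tau)  # denoise the pseudo-data
        # Onsager correction: (number of active coordinates / m) times old residual.
        z = y - A @ x_new + (np.count_nonzero(x_new) / m) * z
        x = x_new
    return x
```

The Onsager term in the residual update is what distinguishes AMP from plain iterative thresholding; it is also what makes the algorithm's behaviour trackable by a scalar state-evolution recursion of the kind the paper analyses.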
Pages: 49
Related Papers
50 records
  • [1] Efficient Sublinear-Regret Algorithms for Online Sparse Linear Regression with Limited Observation
    Ito, Shinji
    Hatano, Daisuke
    Sumita, Hanna
    Yabe, Akihiro
    Fukunaga, Takuro
    Kakimura, Naonori
    Kawarabayashi, Ken-ichi
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017), 2017, 30
  • [2] Least Squares Regression with Markovian Data: Fundamental Limits and Algorithms
    Bresler, Guy
    Jain, Prateek
    Nagaraj, Dheeraj
    Netrapalli, Praneeth
    Wu, Xian
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [3] Incremental Sparse Linear Regression Algorithms for Face Recognition
    Liu, Xiaolan
    Liu, Jiao
    Kong, Zhaoming
    Yang, Xiaowei
    PROCEEDINGS OF 2018 TENTH INTERNATIONAL CONFERENCE ON ADVANCED COMPUTATIONAL INTELLIGENCE (ICACI), 2018, : 69 - 74
  • [4] SIMPLE ALGORITHMS FOR SPARSE LINEAR REGRESSION WITH UNCERTAIN COVARIATES
    Chen, Yudong
    Caramanis, Constantine
    2012 IEEE STATISTICAL SIGNAL PROCESSING WORKSHOP (SSP), 2012, : 413 - 415
  • [5] A UNIFIED VIEW OF DECENTRALIZED ALGORITHMS FOR SPARSE LINEAR REGRESSION
    Maros, Marie
    Scutari, Gesualdo
    2023 IEEE 9TH INTERNATIONAL WORKSHOP ON COMPUTATIONAL ADVANCES IN MULTI-SENSOR ADAPTIVE PROCESSING, CAMSAP, 2023, : 471 - 475
  • [6] Enhancing the fundamental limits of sparsity pattern recovery
    Shaeiri, Zahra
    Karami, Mohammad-Reza
    Aghagolzadeh, Ali
    DIGITAL SIGNAL PROCESSING, 2017, 69 : 275 - 285
  • [7] On the Fundamental Limits of Recovering Tree Sparse Vectors from Noisy Linear Measurements
    Soni, Akshay
    Haupt, Jarvis
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2014, 60 (01) : 133 - 149
  • [8] ALGORITHMS FOR ROBUST LINEAR REGRESSION BY EXPLOITING THE CONNECTION TO SPARSE SIGNAL RECOVERY
    Jin, Yuzhe
    Rao, Bhaskar D.
    2010 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2010, : 3830 - 3833
  • [9] Reducing Complexity of Echo State Networks with Sparse Linear Regression Algorithms
    Ceperic, Vladimir
    Baric, Adrijan
    2014 UKSIM-AMSS 16TH INTERNATIONAL CONFERENCE ON COMPUTER MODELLING AND SIMULATION (UKSIM), 2014, : 26 - 31
  • [10] Linear Bandit Algorithms with Sublinear Time Complexity
    Yang, Shuo
    Ren, Tongzheng
    Shakkottai, Sanjay
    Price, Eric
    Dhillon, Inderjit S.
    Sanghavi, Sujay
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022