Information-theoretic lower bounds for compressive sensing with generative models

Cited by: 18
Authors
Liu Z. [1]
Scarlett J. [1,2]
Affiliations
[1] Department of Computer Science, School of Computing, National University of Singapore, Singapore
[2] Department of Mathematics, National University of Singapore, Singapore
Funding
National Research Foundation, Singapore
Keywords
Compressive sensing; Generative models; Information-theoretic limits; Neural networks; Sparsity;
DOI
10.1109/JSAIT.2020.2980676
Abstract
It has recently been shown that for compressive sensing, significantly fewer measurements may be required if the sparsity assumption is replaced by the assumption that the unknown vector lies near the range of a suitably chosen generative model. In particular, in (Bora et al., 2017) it was shown that roughly O(k log L) random Gaussian measurements suffice for accurate recovery when the generative model is an L-Lipschitz function with bounded k-dimensional inputs, and O(kd log w) measurements suffice when the generative model is a k-input ReLU network with depth d and width w. In this paper, we establish corresponding algorithm-independent lower bounds on the sample complexity using tools from minimax statistical analysis. In accordance with the above upper bounds, our results are summarized as follows: (i) We construct an L-Lipschitz generative model capable of generating group-sparse signals, and show that the resulting necessary number of measurements is Ω(k log L); (ii) Using similar ideas, we construct ReLU networks with high depth and/or high width for which the necessary number of measurements scales as Ω(kd log w / log n) (with output dimension n), and in some cases Ω(kd log w). As a result, we establish that the scaling laws derived in (Bora et al., 2017) are optimal or near-optimal in the absence of further assumptions. © 2020 IEEE.
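For quick reference, the scaling laws quoted in the abstract can be set side by side. This is a summary of the abstract only; the measurement model below (the noise term η and the latent-ball radius r) follows the standard setup in this line of work and is an assumption, since the record does not spell it out.

\[
y = A\,G(z^*) + \eta, \qquad G\colon B_2^k(r) \to \mathbb{R}^n, \qquad A \in \mathbb{R}^{m \times n} \ \text{i.i.d. Gaussian},
\]
\[
\text{$L$-Lipschitz } G:\quad m = O(k \log L) \ \text{sufficient (Bora et al., 2017)}, \qquad m = \Omega(k \log L) \ \text{necessary (this paper)},
\]
\[
\text{ReLU } G \ (\text{depth } d,\ \text{width } w):\quad m = O(kd \log w) \ \text{sufficient}, \qquad m = \Omega\!\left(\frac{kd \log w}{\log n}\right) \ \text{necessary, and in some cases } \Omega(kd \log w).
\]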
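To make the setting concrete, here is a minimal sketch (not from the paper) of the recovery approach of (Bora et al., 2017) that the upper bounds refer to: recover x* = G(z*) from m Gaussian measurements y = A x* by gradient descent over the latent code z. The dimensions, step size, and the random one-hidden-layer ReLU network below are all illustrative assumptions; the objective is nonconvex, so a run may need random restarts.

# Sketch: compressive sensing with a generative prior, assuming a random
# one-hidden-layer ReLU generator and noiseless Gaussian measurements.
import numpy as np

rng = np.random.default_rng(0)
k, w, n, m = 5, 20, 100, 40          # latent dim, width, signal dim, measurements

# Fixed random ReLU generative model G(z) = W2 relu(W1 z).
W1 = rng.normal(size=(w, k)) / np.sqrt(k)
W2 = rng.normal(size=(n, w)) / np.sqrt(w)
relu = lambda u: np.maximum(u, 0.0)
G = lambda z: W2 @ relu(W1 @ z)

z_star = rng.normal(size=k)          # ground-truth latent code
x_star = G(z_star)                   # signal in the range of G
A = rng.normal(size=(m, n)) / np.sqrt(m)   # Gaussian measurement matrix
y = A @ x_star                       # noiseless measurements (m << n)

# Gradient descent on f(z) = 0.5 * ||A G(z) - y||^2 over the latent z.
# Nonconvex: in practice one keeps the best of several random restarts.
z = rng.normal(size=k)
step = 1e-3
for _ in range(5000):
    h = W1 @ z
    residual = A @ (W2 @ relu(h)) - y
    # Chain rule: df/dz = W1^T [ 1{h > 0} * (W2^T A^T residual) ]
    z -= step * (W1.T @ ((h > 0) * (W2.T @ (A.T @ residual))))

print("relative error:", np.linalg.norm(G(z) - x_star) / np.linalg.norm(x_star))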
Pages: 292-303
Page count: 11