Information-theoretic lower bounds for compressive sensing with generative models

Cited by: 18
Authors
Liu Z. [1]
Scarlett J. [1,2]
Affiliations
[1] Department of Computer Science, School of Computing, National University of Singapore, Singapore
[2] Department of Mathematics, National University of Singapore, Singapore
Funding
National Research Foundation, Singapore
Keywords
Compressive sensing; Generative models; Information-theoretic limits; Neural networks; Sparsity;
DOI
10.1109/JSAIT.2020.2980676
Abstract
It has recently been shown that for compressive sensing, significantly fewer measurements may be required if the sparsity assumption is replaced by the assumption that the unknown vector lies near the range of a suitably chosen generative model. In particular, in (Bora et al., 2017) it was shown that roughly O(k log L) random Gaussian measurements suffice for accurate recovery when the generative model is an L-Lipschitz function with bounded k-dimensional inputs, and O(kd log w) measurements suffice when the generative model is a k-input ReLU network with depth d and width w. In this paper, we establish corresponding algorithm-independent lower bounds on the sample complexity using tools from minimax statistical analysis. In accordance with the above upper bounds, our results are summarized as follows: (i) We construct an L-Lipschitz generative model capable of generating group-sparse signals, and show that the resulting necessary number of measurements is Ω(k log L); (ii) Using similar ideas, we construct ReLU networks with high width and/or high depth for which the necessary number of measurements scales as Ω(kd log w / log n) (with output dimension n), and in some cases Ω(kd log w). As a result, we establish that the scaling laws derived in (Bora et al., 2017) are optimal or near-optimal in the absence of further assumptions. © 2020 IEEE.
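
The upper bounds matched by these lower bounds come from the optimization-based recovery scheme of Bora et al. (2017), which estimates the latent input z by minimizing ||A G(z) - y||^2. The following minimal sketch illustrates that scheme under toy assumptions: the fixed-weight two-layer ReLU generator, all dimensions, the step size, and the iteration count are illustrative choices, not taken from the paper.

# Sketch of CSGM-style recovery (Bora et al., 2017): given m Gaussian
# measurements y = A G(z*), run gradient descent on f(z) = ||A G(z) - y||^2.
# All sizes and the generator weights below are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
k, w, n, m = 5, 64, 256, 60          # latent dim, width, output dim, measurements

# Toy depth-2 ReLU generator G: R^k -> R^n with fixed random weights.
W1 = rng.normal(size=(w, k)) / np.sqrt(k)
W2 = rng.normal(size=(n, w)) / np.sqrt(w)

def G(z):
    return W2 @ np.maximum(W1 @ z, 0.0)

def jac_G_T(z, v):
    # Transposed Jacobian-vector product of G at z applied to v:
    # J_G(z)^T v = W1^T diag(1{W1 z > 0}) W2^T v.
    h = W1 @ z
    return W1.T @ ((h > 0) * (W2.T @ v))

# Gaussian measurement matrix and noiseless measurements of x* = G(z*).
A = rng.normal(size=(m, n)) / np.sqrt(m)
z_star = rng.normal(size=k)
y = A @ G(z_star)

# Gradient descent on the (nonconvex) objective; in practice random
# restarts are used, omitted here for brevity. The factor 2 in the
# gradient is absorbed into the step size.
z = rng.normal(size=k)
for _ in range(2000):
    r = A @ G(z) - y                  # residual
    z -= 0.05 * jac_G_T(z, A.T @ r)   # descent step along J_G(z)^T A^T r

print("relative recovery error:",
      np.linalg.norm(G(z) - G(z_star)) / np.linalg.norm(G(z_star)))

The lower bounds in the paper apply to any recovery algorithm, not only to this gradient-based scheme; the sketch is included only to make concrete what the measurement counts O(k log L) and O(kd log w) refer to.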
Pages: 292-303
Page count: 11
Related papers
50 records in total; first 10 shown
  • [1] Models and information-theoretic bounds for nanopore sequencing
    Mao, Wei
    Diggavi, Suhas
    Kannan, Sreeram
    2017 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2017, : 2458 - 2462
  • [2] Models and Information-Theoretic Bounds for Nanopore Sequencing
    Mao, Wei
    Diggavi, Suhas N.
    Kannan, Sreeram
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2018, 64 (04) : 3216 - 3236
  • [3] Information-Theoretic Lower Bounds for Distributed Function Computation
    Xu, Aolin
    Raginsky, Maxim
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2017, 63 (04) : 2314 - 2337
  • [4] Information-theoretic upper and lower bounds for statistical estimation
    Zhang, T
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2006, 52 (04) : 1307 - 1321
  • [5] Information Theoretic Performance Bounds for Noisy Compressive Sensing
    Chen, Junjie
    Liang, Qilian
    Zhang, Baoju
    Wu, Xiaorong
    2013 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS WORKSHOPS (IEEE ICC), 2013, : 972 - 976
  • [6] Information-theoretic lower bounds for convex optimization with erroneous oracles
    Singer, Yaron
    Vondrak, Jan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 28 (NIPS 2015), 2015, 28
  • [7] Approximate Sparsity Pattern Recovery: Information-Theoretic Lower Bounds
    Reeves, Galen
    Gastpar, Michael C.
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2013, 59 (06) : 3451 - 3465
  • [8] Information-Theoretic Lower Bounds for Recovery of Diffusion Network Structures
    Park, Keehwan
    Honorio, Jean
    2016 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY, 2016, : 1346 - 1350
  • [9] Information-Theoretic Lower Bounds on Bayes Risk in Decentralized Estimation
    Xu, Aolin
    Raginsky, Maxim
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2017, 63 (03) : 1580 - 1600
  • [10] Structure Learning of Similar Ising Models: Information-theoretic Bounds
    Sihag, Saurabh
    Tajer, Ali
    2019 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2019, : 1307 - 1311