The Entropic Doubling Constant and Robustness of Gaussian Codebooks for Additive-Noise Channels

Cited by: 0
Authors
Gavalakis, Lampros [1 ]
Kontoyiannis, Ioannis [2 ]
Madiman, Mokshay [3 ]
Affiliations
[1] Univ Paris Est Creteil, Univ Gustave Eiffel, CNRS, LAMA,UMR 8050, F-77447 Marne La Vallee, France
[2] Univ Cambridge, Stat Lab, Cambridge CB3 0WB, England
[3] Univ Delaware, Dept Math Sci, Newark, DE 19716 USA
Keywords
Entropy inequalities; entropic doubling; entropy power; Gaussian codebook; maximum entropy; capacity; multiple access channel; MIMO channel; EMPIRICAL DISTRIBUTION; SUPERPOSITION CODES; SHANNON'S PROBLEM; INEQUALITIES; CAPACITY; MONOTONICITY; STABILITY; SUMSET
DOI
10.1109/TIT.2024.3460472
CLC number
TP [automation technology; computer technology]
Discipline code
0812
Abstract
Entropy comparison inequalities are obtained for the differential entropy h(X+Y) of the sum of two independent random vectors X, Y, when one of them is replaced by a Gaussian. For identically distributed random vectors X, Y, these are closely related to bounds on the entropic doubling constant, which quantifies the entropy increase when adding an independent copy of a random vector to itself. Consequences of both large and small doubling are explored. For the former, lower bounds are deduced on the entropy increase when adding an independent Gaussian, while for the latter, a qualitative stability result for the entropy power inequality is obtained. In the more general case of non-identically distributed random vectors X, Y, a Gaussian comparison inequality with interesting implications for channel coding is established: for additive-noise channels with a power constraint, Gaussian codebooks come within a snr/(3 snr + 2) factor of capacity, where snr denotes the signal-to-noise ratio. In the low-SNR regime this improves the half-a-bit additive bound of Zamir and Erez. Analogous results are obtained for additive-noise multiple access channels and for linear, additive-noise MIMO channels.
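To see why the snr/(3 snr + 2) factor improves on the half-a-bit bound specifically at low SNR, the two capacity-gap guarantees can be compared numerically. The sketch below is my own illustration, not code from the paper; it assumes the standard AWGN-style capacity C(snr) = (1/2) log2(1 + snr) bits and reads the abstract's result as a multiplicative loss of at most C(snr) · snr/(3 snr + 2) bits, versus the additive half-a-bit loss of Zamir and Erez.

```python
import math

def capacity_bits(snr: float) -> float:
    """Capacity of the power-constrained additive Gaussian channel, in bits."""
    return 0.5 * math.log2(1.0 + snr)

def factor_gap_bits(snr: float) -> float:
    """Capacity loss implied by the multiplicative snr/(3 snr + 2) factor
    (my reading of the abstract's bound, expressed in bits)."""
    return capacity_bits(snr) * snr / (3.0 * snr + 2.0)

def zamir_erez_gap_bits(snr: float) -> float:
    """Additive half-a-bit bound of Zamir and Erez, independent of snr."""
    return 0.5

if __name__ == "__main__":
    for snr in (0.01, 0.1, 1.0, 10.0, 100.0):
        print(f"snr={snr:7.2f}  factor bound={factor_gap_bits(snr):.4f} bits  "
              f"half-a-bit bound={zamir_erez_gap_bits(snr):.4f} bits")
```

At low snr the factor bound vanishes like snr^2 (since C(snr) itself scales like snr), so it is far tighter than the constant half-bit; at high snr the additive half-bit bound is the smaller of the two, consistent with the abstract attributing the improvement to the low-SNR regime.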
Pages: 8467-8477
Page count: 11