The Entropy of Sums and Rusza's Divergence on Abelian Groups
Cited: 0
Authors:
Kontoyiannis, Ioannis [1]
Madiman, Mokshay [2]
Affiliations:
[1] Athens Univ Econ & Business, Dept Informat, Athens, Greece
[2] Univ Delaware, Dept Math Sci, Newark, DE USA
Source:
2013 IEEE INFORMATION THEORY WORKSHOP (ITW), 2013
Funding:
US National Science Foundation
Keywords:
DOI:
not available
Chinese Library Classification:
TP301 [Theory, Methods]
Discipline Code:
081202
Abstract:
Motivated by a series of recently discovered inequalities for the sum and difference of discrete or continuous random variables [3], [5], [9], [10], we argue that the most natural, general form of these results is in terms of a special case of mutual information, which we call the Ruzsa divergence between two probability distributions. It can be defined for arbitrary pairs of random variables taking values in any discrete (countable) set, on R^n, or indeed on any locally compact Hausdorff abelian group. We study the basic properties of the Ruzsa divergence and derive numerous consequences. In particular, we show that many of the inequalities in [3], [5], [9], [10] can be stated and proved in a unified way, extending their validity to the present general setting. For example, consequences of the basic properties of the Ruzsa divergence developed here include the fact that the entropies of the sum and the difference of two independent random vectors severely constrain each other, as well as entropy analogues of a number of results in additive combinatorics. Although the setting is quite general, the results are already of interest (and new) in the case of random vectors in R^n. For instance, another consequence in R^n is an entropic analogue (in the setting of log-concave distributions) of the Rogers-Shephard inequality for convex bodies.
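The abstract's central objects, the entropies H(X+Y) and H(X-Y) of the sum and difference of two independent group-valued random variables, can be computed exactly on a small abelian group. The sketch below is purely illustrative and not from the paper: the choice of the cyclic group Z_5, the example distributions, and the function names are ours. It also checks the elementary fact that, for independent variables on a group, adding independent noise cannot decrease entropy.

```python
from math import log2

def entropy(p):
    """Shannon entropy (in bits) of a probability vector, ignoring zero masses."""
    return -sum(q * log2(q) for q in p if q > 0)

def sum_and_difference_entropies(px, py, n):
    """For independent X ~ px and Y ~ py on the cyclic group Z_n,
    return (H(X+Y), H(X-Y)), with the group operation taken mod n.
    Convolves the two mass functions directly (O(n^2))."""
    p_sum = [0.0] * n
    p_diff = [0.0] * n
    for a in range(n):
        for b in range(n):
            p_sum[(a + b) % n] += px[a] * py[b]
            p_diff[(a - b) % n] += px[a] * py[b]
    return entropy(p_sum), entropy(p_diff)

# Example: two skewed distributions on Z_5 (illustrative choices)
n = 5
px = [0.5, 0.2, 0.1, 0.1, 0.1]
py = [0.4, 0.3, 0.1, 0.1, 0.1]
h_plus, h_minus = sum_and_difference_entropies(px, py, n)
print(f"H(X+Y) = {h_plus:.4f} bits, H(X-Y) = {h_minus:.4f} bits")
```

On a finite group both entropies are bounded by log2(n), and each dominates H(X) and H(Y), since conditioning on one independent summand leaves a shifted copy of the other.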
Pages: 2