information;
entropy;
moments of information;
variance of information;
COMMUNICATION-THEORY;
DOI:
Not available
CLC Number:
TP31 [Computer Software];
Discipline Classification Number:
081202 ;
0835 ;
Abstract:
The entropy of a finite probability space or, equivalently, a memoryless source is the average information content of an event. The fact that entropy is an expectation suggests that, in certain applications, it could be quite important to take into account higher moments of information and parameters derived from them, such as the variance or skewness. In this paper we initiate a study of the higher moments of information for sources without memory and sources with memory. We derive properties of these moments for information defined in the sense of Shannon and indicate how these considerations can be extended to include the concepts of information in the sense of Aczél or Rényi. For memoryless sources, these concepts are immediately supported by the usual definitions of moments; for general stationary sources, let alone general sources, no such applicable framework seems to exist; on the other hand, the special properties of stationary Markov sources suggest definitions which are both well-motivated and mathematically meaningful.
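As a minimal sketch of the idea in the abstract, assuming the usual Shannon self-information I(x) = -log p(x) for a memoryless source: entropy is the first moment (mean) of I, and the higher central moments give the variance and skewness of information. The function name and interface below are illustrative, not taken from the paper.

```python
import math

def information_moments(probs, base=2):
    """Mean, variance, and skewness of the self-information -log_b(p)
    for a memoryless source with symbol probabilities `probs`.

    The mean is the Shannon entropy; variance and skewness are the
    second and third central moments of information.
    """
    support = [p for p in probs if p > 0]
    info = [-math.log(p, base) for p in support]
    entropy = sum(p * i for p, i in zip(support, info))                # E[I]
    var = sum(p * (i - entropy) ** 2 for p, i in zip(support, info))  # Var[I]
    mu3 = sum(p * (i - entropy) ** 3 for p, i in zip(support, info))
    skew = mu3 / var ** 1.5 if var > 0 else 0.0
    return entropy, var, skew

# A uniform source: every symbol carries the same information,
# so the variance (and skewness) of information is zero.
print(information_moments([0.25] * 4))  # (2.0, 0.0, 0.0)
```

Note that for a uniform distribution the information content is constant, so all central moments beyond the first vanish; non-uniform sources are exactly where the higher moments become informative.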
Affiliation:
Beijing Normal Univ, Sch Math Sci, Lab Math & Complex Syst, Beijing 100875, Peoples R China