Extrinsic Jensen-Shannon Divergence: Applications to Variable-Length Coding

Cited: 38
Authors
Naghshvar, Mohammad [1 ]
Javidi, Tara [2 ]
Wigger, Michele [3 ]
Affiliations
[1] Univ Calif San Diego, Dept Elect & Comp Engn, San Diego, CA 92093 USA
[2] Univ Calif San Diego, Dept Elect & Comp Engn, La Jolla, CA 92093 USA
[3] Telecom ParisTech, Dept Commun & Elect, F-75013 Paris, France
Keywords
Discrete memoryless channel; variable-length coding; sequential analysis; feedback gain; Burnashev's reliability function; optimal error exponent; FEEDBACK; COMMUNICATION;
DOI
10.1109/TIT.2015.2401004
CLC Number
TP [Automation & Computer Technology]
Subject Classification
0812
Abstract
This paper considers variable-length coding over a discrete memoryless channel with noiseless feedback. It develops a stochastic control view of the problem, whose solution is analyzed via a newly proposed symmetrized divergence, termed the extrinsic Jensen-Shannon (EJS) divergence. It is shown that strictly positive lower bounds on the EJS divergence yield nonasymptotic upper bounds on the expected code length. Such lower bounds, and hence nonasymptotic upper bounds on the expected code length, are derived for two coding schemes: 1) variable-length posterior matching and 2) the MaxEJS coding scheme, which greedily maximizes the EJS divergence. As an asymptotic corollary of the main results, the paper also provides a rate-reliability test: variable-length coding schemes satisfying the conditions of the test for parameters R and E are guaranteed to achieve rate R and error exponent E. The results are specialized to posterior matching and MaxEJS, yielding deterministic one-phase coding schemes that achieve capacity and the optimal error exponent. For the special case of symmetric binary-input channels, simpler deterministic schemes with optimal performance are proposed and analyzed.
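The EJS divergence referenced in the abstract admits a direct computation. Below is a minimal sketch, assuming the standard form EJS(ρ; P_1, …, P_M) = Σ_i ρ_i D(P_i ‖ Σ_{j≠i} ρ_j/(1−ρ_i) P_j), i.e., each hypothesis's observation distribution is compared against the extrinsic mixture of the others; the function name and array layout are illustrative choices, not from the paper.

```python
import numpy as np

def ejs_divergence(rho, P):
    """Sketch of the extrinsic Jensen-Shannon divergence:
        EJS(rho; P) = sum_i rho_i * D( P_i || sum_{j != i} rho_j/(1-rho_i) P_j ).
    rho: length-M prior/posterior over hypotheses (sums to 1).
    P:   M x K row-stochastic matrix; row i is the observation
         distribution under hypothesis i."""
    rho = np.asarray(rho, dtype=float)
    P = np.asarray(P, dtype=float)
    total = 0.0
    for i, r in enumerate(rho):
        if r == 0.0 or r == 1.0:
            # rho_i = 0 contributes nothing; rho_i = 1 leaves no
            # extrinsic mixture to compare against, so skip it here.
            continue
        # Mixture of all other rows, reweighted to sum to 1.
        mix = (rho @ P - r * P[i]) / (1.0 - r)
        mask = P[i] > 0  # 0 * log(0/x) = 0 by convention
        total += r * np.sum(P[i][mask] * np.log(P[i][mask] / mix[mask]))
    return total
```

For M = 2 with a uniform prior this reduces to the symmetrized KL divergence (1/2)D(P_1‖P_2) + (1/2)D(P_2‖P_1), which gives a quick sanity check.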
Pages: 2148-2164
Page count: 17
Related Papers
50 total
  • [1] Nielsen, Frank. On a Generalization of the Jensen-Shannon Divergence and the Jensen-Shannon Centroid. ENTROPY, 2020, 22 (02).
  • [2] Menendez, ML; Pardo, JA; Pardo, L; Pardo, MC. The Jensen-Shannon divergence. JOURNAL OF THE FRANKLIN INSTITUTE-ENGINEERING AND APPLIED MATHEMATICS, 1997, 334B (02): 307-318.
  • [3] Naghshvar, Mohammad; Javidi, Tara. Extrinsic Jensen-Shannon Divergence with Application in Active Hypothesis Testing. 2012 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY PROCEEDINGS (ISIT), 2012.
  • [4] Naghshvar, Mohammad; Javidi, Tara; Chaudhuri, Kamalika. Extrinsic Jensen-Shannon Divergence and Noisy Bayesian Active Learning. 2013 51ST ANNUAL ALLERTON CONFERENCE ON COMMUNICATION, CONTROL, AND COMPUTING (ALLERTON), 2013: 1128-1135.
  • [5] Riyahi, H.; Baratnia, M.; Doostparast, M. Cumulative α-Jensen-Shannon measure of divergence: Properties and applications. COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 2024, 53 (17): 5989-6011.
  • [6] Tsai, SC; Tzeng, WG; Wu, HL. On the Jensen-Shannon divergence and variational distance. IEEE TRANSACTIONS ON INFORMATION THEORY, 2005, 51 (09): 3333-3336.
  • [7] Majtey, A. P.; Borras, A.; Casas, M.; Lamberti, P. W.; Plastino, A. Jensen-Shannon divergence as a measure of the degree of entanglement. INTERNATIONAL JOURNAL OF QUANTUM INFORMATION, 2008, 6: 715-720.
  • [8] Lopez, Marcelo E. Rodriguez; Volk, Anja. Melodic Segmentation Using the Jensen-Shannon Divergence. 2012 11TH INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS (ICMLA 2012), VOL 2, 2012: 351-356.
  • [9] Fuglede, B; Topsoe, F. Jensen-Shannon divergence and Hilbert space embedding. 2004 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY, PROCEEDINGS, 2004: 31-31.
  • [10] Tian, Peihua; Sun, Yuan. Generalized quantum Jensen-Shannon divergence of imaginarity. PHYSICS LETTERS A, 2025, 544.