Mismatch in high-rate entropy-constrained vector quantization

Cited by: 46
Authors
Gray, RM [1]
Linder, T
Affiliations
[1] Stanford Univ, Dept Elect Engn, Informat Syst Lab, Stanford, CA 94305 USA
[2] Queens Univ, Dept Math & Stat, Kingston, ON K7L 3N6, Canada
Funding
U.S. National Science Foundation;
Keywords
entropy constrained; high rate; Kullback-Leibler divergence; Lagrangian; mismatch; quantization; relative entropy; variable rate;
DOI
10.1109/TIT.2003.810637
CLC classification
TP [Automation and Computer Technology];
Discipline code
0812;
Abstract
Bucklew's high-rate vector quantizer mismatch result is extended from fixed-rate coding to variable-rate coding using a Lagrangian formulation. It is shown that if an asymptotically (high-rate) optimal sequence of variable-rate codes is designed for a k-dimensional probability density function (pdf) g and then applied to another pdf f for which f/g is bounded, then the resulting mismatch or loss of performance from the optimal possible is given by the relative entropy or Kullback-Leibler divergence I(f||g). It is also shown that under the same assumptions, an asymptotically optimal code sequence for g can be converted to an asymptotically optimal code sequence for a mismatched source f by modifying only the lossless component of the code. Applications to quantizer design using uniform and Gaussian densities are described, including a high-rate analog of the Shannon rate-distortion result of Sakrison and Lapidoth showing that the Gaussian is the "worst case" for lossy compression of a source with known covariance. By coupling the mismatch result with composite quantizers, the worst-case properties of uniform and Gaussian densities are extended to conditionally uniform and Gaussian densities, which provides a Lloyd clustering algorithm for fitting mixtures to general densities.
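For orientation, the divergence named in the abstract is the standard Kullback-Leibler divergence between the k-dimensional densities. The lines below are only a hedged restatement of the mismatch claim in illustrative notation: the symbols rho_lambda (Lagrangian distortion-rate performance of a code) and Q_n^(g) (the n-th code of a sequence designed to be asymptotically optimal for g) are not the paper's own, and the precise normalization and regularity conditions are those given in the paper.

    I(f \| g) \;=\; \int_{\mathbb{R}^k} f(x) \,\log \frac{f(x)}{g(x)} \, dx

    \lim_{n \to \infty} \Big[ \rho_\lambda\big(f, Q_n^{(g)}\big) \;-\; \inf_{Q} \rho_\lambda(f, Q) \Big] \;=\; I(f \| g), \qquad \text{assuming } f/g \text{ is bounded.}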
Pages: 1204-1217
Number of pages: 14
Related papers
50 records in total
  • [21] A novel entropy-constrained competitive learning algorithm for vector quantization
    Hwang, WJ
    Ye, BY
    Liao, SC
    NEUROCOMPUTING, 1999, 25 (1-3) : 133 - 147
  • [22] On Entropy-Constrained Vector Quantization using Gaussian Mixture Models
    Zhao, David Y.
    Samuelsson, Jonas
    Nilsson, Mattias
    IEEE TRANSACTIONS ON COMMUNICATIONS, 2008, 56 (12) : 2094 - 2104
  • [23] Entropy Density and Mismatch in High-Rate Scalar Quantization With Renyi Entropy Constraint
    Kreitmeier, Wolfgang
    Linder, Tamas
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2012, 58 (07) : 4105 - 4116
  • [24] Entropy-constrained gradient-match vector quantization for image coding
    Juan, SC
    Lee, CY
    PROCEEDINGS OF THE 1998 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING, VOLS 1-6, 1998, : 2665 - 2668
  • [25] Color Retinal Image Coding Based on Entropy-Constrained Vector Quantization
    Setiawan, Agung W.
    Suksmono, Andriyan B.
    Mengko, Tati R.
    SECOND INTERNATIONAL CONFERENCE ON DIGITAL IMAGE PROCESSING, 2010, 7546
  • [26] A fast PNN design algorithm for entropy-constrained residual vector quantization
    Kossentini, F
    Smith, MJT
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 1998, 7 (07) : 1045 - 1050
  • [27] On optimal entropy-constrained deadzone quantization
    Tao, B
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2001, 11 (04) : 560 - 563
  • [28] Image coding using entropy-constrained residual vector quantization
    Kossentini, F
    Smith, MJT
    Barnes, CF
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 1995, 4 (10) : 1349 - 1357
  • [29] Image coding using entropy-constrained reflected residual vector quantization
    Khan, MAU
    Mousa, WAH
    2002 INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, VOL I, PROCEEDINGS, 2002, : 253 - 256
  • [30] Joint entropy-constrained multiterminal quantization
    Cardinal, J
    Van Assche, G
    ISIT: 2002 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY, PROCEEDINGS, 2002, : 63 - 63