UPPER AND LOWER BOUNDS FOR APPROXIMATION OF THE KULLBACK-LEIBLER DIVERGENCE BETWEEN HIDDEN MARKOV MODELS

Cited: 0
Authors
Li, Haiyang [1 ]
Han, Jiqing [1 ]
Zheng, Tieran [1 ]
Zheng, Guibin [1 ]
Affiliations
[1] Harbin Inst Technol, Sch Comp Sci & Technol, Harbin 150006, Peoples R China
Keywords
Kullback-Leibler divergence; hidden Markov model; automatic speech recognition; speech processing
DOI
Not available
Chinese Library Classification (CLC)
O42 [Acoustics]
Discipline codes
070206; 082403
Abstract
The Kullback-Leibler (KL) divergence is often used for similarity comparison between two hidden Markov models (HMMs). However, there is no closed-form expression for the KL divergence between HMMs, and it can only be approximated. In this paper, we propose two novel methods for approximating the KL divergence between left-to-right transient HMMs. The first method is a product approximation that can be calculated recursively without introducing extra parameters. The second method is based on upper and lower bounds of the KL divergence, whose mean provides a practical approximation of the divergence. We demonstrate the effectiveness of the proposed methods through experiments measuring the deviation from a numerical approximation and on the task of predicting the confusability of phone pairs. Experimental results show that the proposed product approximation is comparable with the current variational approximation, and that the proposed bound-based approximation outperforms current methods in these experiments.
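The paper's product and bound-based approximations are not reproduced here. As a point of reference, the standard Monte Carlo baseline that such closed-form approximations are typically compared against can be sketched as follows: draw sequences from HMM p, score each under both models with the forward algorithm, and average the log-likelihood ratios. This is a generic illustration, not the authors' method; it uses a simple ergodic discrete-output HMM rather than the left-to-right transient models treated in the paper, and all parameter names are illustrative.

```python
import numpy as np

def sample_hmm(pi, A, B, T, rng):
    """Draw one observation sequence of length T from a discrete HMM
    with initial distribution pi, transitions A, and emissions B."""
    state = rng.choice(len(pi), p=pi)
    obs = []
    for _ in range(T):
        obs.append(rng.choice(B.shape[1], p=B[state]))
        state = rng.choice(A.shape[1], p=A[state])
    return obs

def log_likelihood(obs, pi, A, B):
    """log P(obs | HMM) computed with the scaled forward algorithm."""
    alpha = pi * B[:, obs[0]]
    ll = 0.0
    for o in obs[1:]:
        c = alpha.sum()          # scaling factor, accumulated in log space
        ll += np.log(c)
        alpha = (alpha / c) @ A * B[:, o]
    return ll + np.log(alpha.sum())

def kl_monte_carlo(hmm_p, hmm_q, T=10, n_samples=500, seed=0):
    """Monte Carlo estimate of KL(p || q) for sequences of length T:
    the average of log p(O) - log q(O) over sequences O drawn from p."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_samples):
        obs = sample_hmm(*hmm_p, T, rng)
        total += log_likelihood(obs, *hmm_p) - log_likelihood(obs, *hmm_q)
    return total / n_samples
```

This estimator converges to the true divergence as the sample count grows, but at a cost that motivates the closed-form product and bound-based approximations studied in the paper.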
Pages: 7609-7613
Page count: 5