UPPER AND LOWER BOUNDS FOR APPROXIMATION OF THE KULLBACK-LEIBLER DIVERGENCE BETWEEN HIDDEN MARKOV MODELS

Cited: 0
Authors
Li, Haiyang [1 ]
Han, Jiqing [1 ]
Zheng, Tieran [1 ]
Zheng, Guibin [1 ]
Institutions
[1] Harbin Inst Technol, Sch Comp Sci & Technol, Harbin 150006, Peoples R China
Keywords
Kullback-Leibler divergence; hidden Markov model; automatic speech recognition; speech processing
DOI
None
CLC Classification
O42 [Acoustics]
Subject Classification
070206; 082403
Abstract
The Kullback-Leibler (KL) divergence is often used to compare the similarity of two hidden Markov models (HMMs). However, there is no closed-form expression for the KL divergence between HMMs; it can only be approximated. In this paper, we propose two novel methods for approximating the KL divergence between left-to-right transient HMMs. The first is a product approximation that can be computed recursively without introducing extra parameters. The second is based on upper and lower bounds of the KL divergence, whose mean provides a usable approximation of the divergence. We demonstrate the effectiveness of the proposed methods through experiments, including the deviation from a numerical approximation and the task of predicting the confusability of phone pairs. Experimental results show that the proposed product approximation is comparable with the current variational approximation, and that the proposed bounds-based approximation outperforms current methods.
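As background for the approximation problem the abstract describes, the following is a minimal sketch of the standard Monte Carlo baseline for estimating the KL divergence rate between two HMMs: sample observation sequences from the first model and average the log-likelihood ratio computed via the forward algorithm. The toy discrete-emission left-to-right models, parameter values, and function names here are illustrative assumptions, not the paper's method.

```python
import math
import random

def forward_loglik(pi, A, B, obs):
    """Log-likelihood of an observation sequence via the scaled forward algorithm."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    logp = 0.0
    for t in range(1, len(obs)):
        s = sum(alpha)
        logp += math.log(s)
        alpha = [a / s for a in alpha]  # rescale to avoid underflow
        alpha = [sum(alpha[j] * A[j][i] for j in range(n)) * B[i][obs[t]]
                 for i in range(n)]
    logp += math.log(sum(alpha))
    return logp

def sample_seq(pi, A, B, T, rng):
    """Draw one length-T observation sequence from the HMM."""
    def draw(probs):
        r, c = rng.random(), 0.0
        for k, p in enumerate(probs):
            c += p
            if r < c:
                return k
        return len(probs) - 1
    state = draw(pi)
    obs = [draw(B[state])]
    for _ in range(T - 1):
        state = draw(A[state])
        obs.append(draw(B[state]))
    return obs

def mc_kl(hmm_p, hmm_q, T=20, n=2000, seed=0):
    """Monte Carlo per-symbol KL estimate: E_p[log p(x) - log q(x)] / T."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = sample_seq(*hmm_p, T, rng)
        total += forward_loglik(*hmm_p, x) - forward_loglik(*hmm_q, x)
    return total / (n * T)

# Two toy 2-state left-to-right HMMs (second state absorbing), 2 output symbols.
p = ([1.0, 0.0], [[0.7, 0.3], [0.0, 1.0]], [[0.9, 0.1], [0.2, 0.8]])
q = ([1.0, 0.0], [[0.5, 0.5], [0.0, 1.0]], [[0.6, 0.4], [0.3, 0.7]])
print(mc_kl(p, q))  # nonnegative in expectation
```

This sampling-based estimate converges to the true divergence rate but requires many sequences; the closed-form-style approximations the paper proposes aim to avoid that cost.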
Pages: 7609-7613 (5 pages)