Variational Bayesian inference for a Dirichlet process mixture of beta distributions and application

Cited by: 7
Authors
Lai, Yuping [1 ]
Ping, Yuan [2 ]
Xiao, Ke [1 ]
Hao, Bin [3 ]
Zhang, Xiufeng [4 ]
Affiliations
[1] North China Univ Technol, Coll Comp Sci & Technol, Beijing, Peoples R China
[2] Xuchang Univ, Sch Informat Engn, Xuchang, Peoples R China
[3] Chinese Univ Hong Kong, Inst Network Coding, Shatin, Hong Kong, Peoples R China
[4] Natl Res Ctr Rehabil Tech Aids, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Dirichlet process; Nonparametric Bayesian analysis; Beta distribution; Infinite mixture model; Variational inference; Image categorization; Object detection; HIDDEN MARKOV MODEL; INFORMATION CRITERION; CLASSIFICATION;
DOI
10.1016/j.neucom.2017.07.068
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline classification codes
081104; 0812; 0835; 1405;
Abstract
The finite beta mixture model (BMM) has been shown to be very flexible and powerful for modeling data with bounded support. However, the BMM cannot automatically select the proper number of mixture components from the observed data, a choice that strongly affects modeling accuracy. In this paper, we tackle this problem with an infinite beta mixture model (InBMM). It is based on the Dirichlet process (DP) mixture, which assumes a countably infinite number of mixture components a priori, so that the effective number of components can be determined automatically from the observed data. Furthermore, a variational InBMM using a single lower-bound approximation (VBInBMM) is proposed, which applies the stick-breaking representation of the DP and is learned within an extended variational inference framework. Numerical experiments on both synthetic and real data, drawn from two challenging applications, namely image categorization and object detection, demonstrate the good performance of the proposed method. (C) 2017 Published by Elsevier B.V.
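To make the model described in the abstract concrete, the sketch below (not the authors' VBInBMM code) illustrates two standard ingredients it builds on: drawing bounded-support data from a truncated stick-breaking DP mixture of beta densities, and the usual mean-field update for the stick lengths given a responsibility matrix, as in Blei and Jordan's variational treatment of DP mixtures. The paper's extended single lower-bound updates for the beta shape parameters are not reproduced here, and helper names such as sample_inbmm and update_sticks are hypothetical.

```python
# Illustrative sketch only: truncated stick-breaking construction of an
# infinite beta mixture, plus the standard variational update for the
# stick lengths. Function names are hypothetical, not from the paper.
import numpy as np

def stick_breaking_weights(alpha, T, rng):
    """Truncated stick-breaking weights pi_1..pi_T with v_k ~ Beta(1, alpha)."""
    v = rng.beta(1.0, alpha, size=T)
    v[-1] = 1.0                                   # close the truncated stick
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    return v * remaining                          # pi_k = v_k * prod_{j<k}(1 - v_j)

def sample_inbmm(n, alpha, shape_a, shape_b, T, seed=0):
    """Sample data in (0, 1) from a truncated DP mixture of beta densities."""
    rng = np.random.default_rng(seed)
    pi = stick_breaking_weights(alpha, T, rng)
    z = rng.choice(T, size=n, p=pi)               # component assignments
    x = rng.beta(shape_a[z], shape_b[z])          # bounded-support observations
    return x, z, pi

def update_sticks(resp, alpha):
    """Mean-field update q(v_k) = Beta(g1_k, g2_k) for the stick lengths,
    given an n-by-T responsibility matrix `resp` (rows sum to 1)."""
    nk = resp.sum(axis=0)                         # expected counts per component
    tail = np.concatenate((np.cumsum(nk[::-1])[::-1][1:], [0.0]))  # sum_{j>k} nk_j
    return 1.0 + nk, alpha + tail                 # (gamma_1, gamma_2)

if __name__ == "__main__":
    a = np.array([2.0, 5.0, 0.5, 3.0, 8.0])       # example beta shape parameters
    b = np.array([5.0, 2.0, 0.5, 3.0, 1.5])
    x, z, pi = sample_inbmm(n=500, alpha=1.0, shape_a=a, shape_b=b, T=5)
    print("truncated weights:", np.round(pi, 3))
```

In a full coordinate-ascent scheme, update_sticks would alternate with updates of the responsibilities and of the variational factors over the beta shape parameters; the latter require the single lower-bound approximation developed in the paper, since the beta distribution has no conjugate prior for its shape parameters.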
Pages: 23 - 33
Page count: 11
Related papers
50 in total
  • [31] Variational Bayesian inference with Gaussian-mixture approximations
    Zobay, O.
    ELECTRONIC JOURNAL OF STATISTICS, 2014, 8 : 355 - 389
  • [32] Reliable and Scalable Variational Inference for the Hierarchical Dirichlet Process
    Hughes, Michael C.
    Kim, Dae Il
    Sudderth, Erik B.
    ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 38, 2015, 38 : 370 - 378
  • [34] Online Variational Learning of Dirichlet Process Mixtures of Scaled Dirichlet Distributions
    Manouchehri, Narges
    Nguyen, Hieu
    Koochemeshkian, Pantea
    Bouguila, Nizar
    Fan, Wentao
    INFORMATION SYSTEMS FRONTIERS, 2020, 22 (05) : 1085 - 1093
  • [35] Mean field inference for the Dirichlet process mixture model
    Zobay, O.
    ELECTRONIC JOURNAL OF STATISTICS, 2009, 3 : 507 - 545
  • [36] Bayesian Inference of a Finite Mixture of Inverse Weibull Distributions with an Application to Doubly Censoring Data
    Feroze, Navid
    PAKISTAN JOURNAL OF STATISTICS AND OPERATION RESEARCH, 2016, 12 (01) : 53 - 72
  • [37] Bayesian inference for dynamic models with dirichlet process mixtures
    Caron, Francois
    Davy, Manuel
    Doucet, Arnaud
    Duflos, Emmanuel
    Vanheeghe, Philippe
2006 9th International Conference on Information Fusion, Vols 1-4, 2006 : 138 - 145
  • [38] A hierarchical Dirichlet process mixture of generalized Dirichlet distributions for feature selection
    Fan, Wentao
    Sallay, Hassen
    Bouguila, Nizar
    Bourouis, Sami
    COMPUTERS & ELECTRICAL ENGINEERING, 2015, 43 : 48 - 65
  • [39] A Dirichlet Process Mixture of Generalized Dirichlet Distributions for Proportional Data Modeling
    Bouguila, Nizar
    Ziou, Djemel
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 2010, 21 (01) : 107 - 122
  • [40] Variational Bayesian inference for infinite generalized inverted Dirichlet mixtures with feature selection and its application to clustering
    Bdiri, Taoufik
    Bouguila, Nizar
    Ziou, Djemel
    APPLIED INTELLIGENCE, 2016, 44 (03) : 507 - 525