On the Hierarchical Bernoulli Mixture Model Using Bayesian Hamiltonian Monte Carlo

Cited: 0
Authors
Suryaningtyas, Wahyuni [1 ,2 ]
Iriawan, Nur [1 ]
Kuswanto, Heri [1 ]
Zain, Ismaini [1 ]
Affiliations
[1] Inst Teknol Sepuluh Nopember, Fac Sci & Data Analyt, Dept Stat, Surabaya 60111, Indonesia
[2] Univ Muhammadiyah Surabaya, Study Program Math Educ, Fac Teacher Training & Educ, Jl Sutorejo 59, Surabaya 60113, Indonesia
Source
SYMMETRY-BASEL | 2021, Vol. 13, No. 12
Keywords
Bernoulli mixture model; finite mixture; Hamiltonian Monte Carlo; WAIC; multilevel models
DOI
10.3390/sym13122404
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Subject Classification Codes
07; 0710; 09
Abstract
The model developed here addresses data-driven binary responses (coded 0 and 1) identified as following a Bernoulli distribution with finite mixture components. In social science applications, such Bernoulli-distributed responses often arise within hierarchically structured data. This study introduces the Hierarchical Bernoulli mixture model (Hibermimo), a new analytical model that combines the Bernoulli mixture with hierarchically structured data. The proposed approach uses a Hamiltonian Monte Carlo algorithm with a No-U-Turn Sampler (HMC/NUTS). A compatible syntax program was implemented with HMC/NUTS to estimate both the Bayesian Bernoulli mixture aggregate regression model (BBMARM) and Hibermimo. In the model estimation, Hibermimo achieved roughly 90% agreement with the district-level models and a small Widely Applicable Information Criterion (WAIC) value.
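As a rough illustration of the model class described in the abstract, the sketch below fits a two-component hierarchical Bernoulli mixture with HMC/NUTS and scores it with WAIC. It is written in PyMC on simulated toy data; the number of components, the Beta hyperpriors, and the district grouping are illustrative assumptions only, not the authors' Hibermimo specification.

```python
import numpy as np
import pymc as pm
import arviz as az

# Hypothetical toy data: binary responses grouped by district (hierarchical level).
rng = np.random.default_rng(42)
n_districts, n_per = 5, 40
district = np.repeat(np.arange(n_districts), n_per)
y = rng.binomial(1, 0.4, size=n_districts * n_per)

K = 2  # assumed number of Bernoulli mixture components

with pm.Model() as hierarchical_bernoulli_mixture_sketch:
    # Mixture weights shared across districts
    w = pm.Dirichlet("w", a=np.ones(K))
    # Hierarchical prior: component-level means/concentrations generate
    # district-by-component success probabilities
    mu = pm.Beta("mu", 2.0, 2.0, shape=K)
    kappa = pm.HalfNormal("kappa", 10.0, shape=K)
    theta = pm.Beta("theta", alpha=mu * kappa, beta=(1 - mu) * kappa,
                    shape=(n_districts, K))
    # Finite Bernoulli mixture likelihood, indexed by each observation's district
    components = pm.Bernoulli.dist(p=theta[district, :])
    pm.Mixture("y_obs", w=w, comp_dists=components, observed=y)

    # HMC with the No-U-Turn Sampler (PyMC's default step method)
    idata = pm.sample(1000, tune=1000, target_accept=0.9,
                      idata_kwargs={"log_likelihood": True})

# Widely Applicable Information Criterion for model comparison
print(az.waic(idata))
```

In practice, sampler diagnostics (divergences, R-hat) should be checked, and label switching between mixture components may require an ordering constraint or post-hoc relabeling.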
Pages: 20