Learning from Inconsistent and Unreliable Annotators by a Gaussian Mixture Model and Bayesian Information Criterion

Cited by: 0
Authors
Zhang, Ping [1 ]
Obradovic, Zoran [1 ]
Affiliations
[1] Temple Univ, Ctr Data Analyt & Biomed Informat, Philadelphia, PA 19122 USA
Keywords
multiple noisy experts; data-dependent experts; Gaussian mixture model; Bayesian information criterion;
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Supervised learning from multiple annotators is an increasingly important problem in machine learning and data mining. This paper develops a probabilistic approach to this problem for the case where annotators are not only unreliable but also vary in performance depending on the data. The proposed approach uses a Gaussian mixture model (GMM) and the Bayesian information criterion (BIC) to select the best-fitting model for the distribution of the instances. It then alternates between maximum a posteriori (MAP) estimation of the hidden true labels and maximum-likelihood (ML) estimation of the quality of the multiple annotators. Experiments on emotional speech classification and CASP9 protein disorder prediction tasks show that the proposed approach outperforms both a majority-voting baseline and a previous data-independent approach. Moreover, the approach provides more accurate estimates of each annotator's performance within each Gaussian component, paving the way for understanding the behavior of individual annotators.
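The abstract describes the algorithm only at a high level. Below is a minimal, hypothetical sketch of the idea in Python, assuming binary labels, scikit-learn's GaussianMixture for the BIC-driven model selection, and a Raykar-style two-coin annotator model estimated separately inside each Gaussian component; the toy data, variable names, and the majority-vote initialization are illustrative assumptions, not the authors' code.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Toy data: N instances, D features, R annotators giving noisy binary labels.
N, D, R = 300, 2, 3
X = np.vstack([rng.normal(-2.0, 1.0, (N // 2, D)),
               rng.normal(2.0, 1.0, (N // 2, D))])
true_y = np.repeat([0, 1], N // 2)
# Annotator flip rates depend on the data region (the paper's premise of
# data-dependent experts); the exact rates here are arbitrary.
rates = np.where(true_y[:, None] == 0, [0.05, 0.35, 0.20], [0.30, 0.10, 0.40])
Y = np.where(rng.random((N, R)) < rates, 1 - true_y[:, None], true_y[:, None])

# Step 1: fit GMMs with 1..5 components and keep the one with the lowest BIC.
candidates = [GaussianMixture(n_components=k, random_state=0).fit(X)
              for k in range(1, 6)]
gmm = min(candidates, key=lambda m: m.bic(X))
z = gmm.predict(X)            # hard component assignment for each instance
K = gmm.n_components

# Step 2: alternate MAP estimation of the hidden labels and ML estimation of
# per-component annotator quality (sensitivity alpha, specificity beta).
mu = Y.mean(axis=1)           # soft label estimates, majority-vote init
alpha = np.full((K, R), 0.8)
beta = np.full((K, R), 0.8)
for _ in range(20):
    # ML step: annotator quality estimated separately inside each component.
    for k in range(K):
        m = z == k
        w = mu[m]
        alpha[k] = (w @ Y[m]) / max(w.sum(), 1e-9)
        beta[k] = ((1 - w) @ (1 - Y[m])) / max((1 - w).sum(), 1e-9)
    # MAP step: posterior of the hidden true label given all annotations.
    p = mu.mean()             # class prior re-estimated from current labels
    a = np.prod(np.where(Y == 1, alpha[z], 1 - alpha[z]), axis=1)
    b = np.prod(np.where(Y == 0, beta[z], 1 - beta[z]), axis=1)
    mu = a * p / (a * p + b * (1 - p) + 1e-12)

print("label accuracy vs. ground truth:", np.mean((mu > 0.5) == true_y))
print("estimated per-component sensitivities:\n", np.round(alpha, 2))
```

Selecting the component count by BIC before the alternating estimation mirrors the two-stage structure described in the abstract; in practice, soft component responsibilities from gmm.predict_proba could replace the hard assignments used in this sketch.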
Pages: 553 - 568
Page count: 16
Related Papers
50 records in total
  • [41] Community Embeddings with Bayesian Gaussian Mixture Model and Variational Inference
    Begehr, Anton I. N.
    Panfilov, Peter B.
    2022 IEEE 24TH CONFERENCE ON BUSINESS INFORMATICS (CBI 2022), VOL 2, 2022: 88 - 96
  • [42] Comments on "A critique of the Bayesian information criterion for model selection"
    Firth, D
    Kuha, J
    SOCIOLOGICAL METHODS & RESEARCH, 1999, 27 (03) : 398 - 402
  • [43] Blind separation of instantaneous mixture of sources via the Gaussian mutual information criterion
    Pham, DT
    SIGNAL PROCESSING, 2001, 81 (04) : 855 - 870
  • [44] Forest construction of Gaussian and discrete variables with the application of Watanabe Bayesian Information Criterion
    Islam, A.
    Suzuki, J.
    Behaviormetrika, 2024, 51 (2) : 589 - 616
  • [45] A Bayesian sampling framework for asymmetric generalized Gaussian mixture models learning
    Vemuri, Ravi Teja
    Azam, Muhammad
    Bouguila, Nizar
    Patterson, Zachary
    NEURAL COMPUTING & APPLICATIONS, 2022, 34 (17): 14123 - 14134
  • [46] Bayesian Learning of Infinite Asymmetric Gaussian Mixture Models for Background Subtraction
    Song, Ziyang
    Ali, Samr
    Bouguila, Nizar
    IMAGE ANALYSIS AND RECOGNITION, ICIAR 2019, PT I, 2019, 11662 : 264 - 274
  • [47] Gaussian Mixture Filter Based on Variational Bayesian Learning in PPP/SINS
    Dai, Qing
    Sui, Lifen
    Tian, Yuan
    Zeng, Tian
    CHINA SATELLITE NAVIGATION CONFERENCE (CSNC) 2017 PROCEEDINGS, VOL II, 2017, 438 : 429 - 444
  • [49] Large Margin Learning of Bayesian Classifiers Based on Gaussian Mixture Models
    Pernkopf, Franz
    Wohlmayr, Michael
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, PT III, 2010, 6323 : 50 - 66
  • [50] Asymmetric Mixture Model with Variational Bayesian Learning
    Thanh Minh Nguyen
    Wu, Q. M. Jonathan
    PROCEEDINGS OF THE 2014 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2014: 285 - 290