Variational Mixtures of Gaussian Processes for Classification

Cited: 0
Authors
Luo, Chen [1 ]
Sun, Shiliang [1 ]
Affiliations
[1] East China Normal Univ, Dept Comp Sci & Technol, 3663 North Zhongshan Rd, Shanghai 200062, Peoples R China
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Gaussian processes (GPs) are powerful machine learning tools that have been applied to both classification and regression. Mixtures of GPs were later proposed to further improve the modeling capacity of GPs. However, these mixture models are formulated for regression problems. In this work, we propose a new Mixture of Gaussian Processes for Classification (MGPC). Instead of the Gaussian likelihood used in regression, MGPC employs a logistic likelihood to obtain class probabilities, which makes it suitable for classification problems. The posterior distribution over the latent variables is approximated through variational inference, and the hyperparameters are optimized with a variational EM method and a greedy algorithm. Experiments on multiple real-world datasets show improvements in predictive performance over five widely used methods. The results also indicate that for classification MGPC is significantly better than a regression mixture of GPs, in contrast to the existing consensus that their single-model counterparts perform comparably.
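As a minimal sketch (not taken from the paper itself; the mixture weights \pi_k, latent functions f_k, kernels \kappa_k, and component count K are assumed notation), the model described in the abstract can be read as a logistic mixture of GP classifiers:

\[
p(y = 1 \mid x) \;=\; \sum_{k=1}^{K} \pi_k(x)\,\sigma\!\big(f_k(x)\big),
\qquad
f_k \sim \mathcal{GP}\big(0, \kappa_k(\cdot,\cdot)\big),
\qquad
\sigma(z) = \frac{1}{1 + e^{-z}},
\]

where each component k carries its own GP prior over a latent function, the logistic function \sigma replaces the Gaussian likelihood of the regression mixtures, and variational inference approximates the intractable posterior over the latent functions and component assignments.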
Pages: 4603 - 4609
Number of pages: 7
Related Papers
50 records in total
  • [31] Variational Gaussian Processes: A Functional Analysis View
    Wild, Veit
    Wynne, George
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151, 2022, 151
  • [32] Stochastic complexities of Gaussian mixtures in variational Bayesian approximation
    Watanabe, K
    Watanabe, S
    JOURNAL OF MACHINE LEARNING RESEARCH, 2006, 7 : 625 - 643
  • [33] Semisupervised Classification With Sequence Gaussian Mixture Variational Autoencoder
    Wang, Shuangqing
    Yu, Jianbo
    Li, Zhi
    Chai, Tianyou
    IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS, 2024, 71 (09) : 11540 - 11548
  • [34] BENIGN OVERFITTING IN BINARY CLASSIFICATION OF GAUSSIAN MIXTURES
    Wang, Ke
    Thrampoulidis, Christos
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 4030 - 4034
  • [35] Variational Bayesian multinomial logistic Gaussian process classification
    Cho, Wanhyun
    Na, Inseop
    Kim, Sangkyoon
    Park, Soonyoung
    MULTIMEDIA TOOLS AND APPLICATIONS, 2018, 77 (14) : 18563 - 18582
  • [36] Object classification by fusing SVMs and Gaussian mixtures
    Deselaers, Thomas
    Heigold, Georg
    Ney, Hermann
    PATTERN RECOGNITION, 2010, 43 (07) : 2476 - 2484
  • [38] Growing Gaussian mixtures network for classification applications
    Alba, JL
    Docío, L
    Docampo, D
    Márquez, OW
    SIGNAL PROCESSING, 1999, 76 (01) : 43 - 60
  • [39] Learning spatial patterns with variational Gaussian processes: Regression
    Goncalves, Italo Gomes
    Guadagnin, Felipe
    Cordova, Diogo Peixoto
    COMPUTERS & GEOSCIENCES, 2022, 161
  • [40] Promoting active learning with mixtures of Gaussian processes
    Zhao, Jing
    Sun, Shiliang
    Wang, Huijuan
    Cao, Zehui
    KNOWLEDGE-BASED SYSTEMS, 2020, 188