Variational Mixtures of Gaussian Processes for Classification

Cited by: 0
Authors
Luo, Chen [1]
Sun, Shiliang [1]
Affiliations
[1] East China Normal Univ, Dept Comp Sci & Technol, 3663 North Zhongshan Rd, Shanghai 200062, Peoples R China
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Gaussian Processes (GPs) are powerful machine learning tools that have been applied to both classification and regression. Mixture models of GPs were later proposed to further improve GPs for data modeling, but these models are formulated for regression problems. In this work, we propose a new Mixture of Gaussian Processes for Classification (MGPC). Instead of the Gaussian likelihood used for regression, MGPC employs the logistic function as the likelihood to obtain class probabilities, which is suitable for classification problems. The posterior distribution of the latent variables is approximated through variational inference, and the hyperparameters are optimized through the variational EM method and a greedy algorithm. Experiments on multiple real-world datasets show improvements in predictive performance over five widely used methods. The results also indicate that, for classification, MGPC is significantly better than the regression model with mixtures of GPs, in contrast to the existing consensus that their single-model counterparts are comparable.
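The abstract specifies only the form of the model: each expert k has a GP over a latent function f_k, a gating mechanism assigns input-dependent mixture weights pi_k(x), and the logistic function maps latent values to class probabilities, so p(y=1|x) = sum_k pi_k(x) * sigmoid(f_k(x)). The paper's variational EM and greedy optimization are not reproduced here; the following is a minimal NumPy sketch of that predictive rule only. The function mgpc_predict, the placeholder inputs f_means (per-expert latent posterior means) and resp (per-point expert responsibilities), and the fixed RBF kernel are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) kernel matrix between two input sets."""
    sqdist = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * sqdist / lengthscale ** 2)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mgpc_predict(X_train, f_means, resp, X_test, lengthscale=1.0, jitter=1e-4):
    """Hypothetical mixture-of-GP-experts classifier (sketch, not the paper's code):
    p(y=1 | x) = sum_k pi_k(x) * sigmoid(f_k(x)).

    f_means : (K, N) posterior means of each expert's latent function at the
              N training inputs (stand-in for the variational posterior).
    resp    : (K, N) responsibilities of the K experts for each training
              point; each column sums to 1.
    """
    N = len(X_train)
    Knn = rbf_kernel(X_train, X_train, lengthscale) + jitter * np.eye(N)
    Ksn = rbf_kernel(X_test, X_train, lengthscale)
    # GP conditional means carry both the latent functions and the gating
    # responsibilities from the training inputs to the test inputs.
    f_star = Ksn @ np.linalg.solve(Knn, f_means.T)   # (M, K) latent means
    pi_star = Ksn @ np.linalg.solve(Knn, resp.T)     # (M, K) gating weights
    pi_star = np.clip(pi_star, 1e-9, None)
    pi_star /= pi_star.sum(axis=1, keepdims=True)    # renormalize over experts
    return (pi_star * sigmoid(f_star)).sum(axis=1)   # mix logistic likelihoods

# Toy usage: two experts with opposing trends, each "owning" half the input space.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(40, 1))
f_means = np.vstack([2.0 * X[:, 0], -2.0 * X[:, 0]])
resp = np.vstack([(X[:, 0] < 0).astype(float), (X[:, 0] >= 0).astype(float)])
print(mgpc_predict(X, f_means, resp, np.array([[-1.0], [1.0]])))
```

In the full model, f_means and resp would come out of the variational E-step rather than being supplied by hand; the sketch only shows how the per-expert logistic likelihoods and input-dependent gating weights combine at prediction time.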
Pages: 4603-4609
Number of pages: 7