Multi-class Gaussian Process Classification with Noisy Inputs

Cited by: 0
Authors
Villacampa-Calvo, Carlos [1 ]
Zaldivar, Bryan [2 ,3 ]
Garrido-Merchan, Eduardo C. [1 ]
Hernandez-Lobato, Daniel [1 ]
Affiliations
[1] Univ Autonoma Madrid, Comp Sci Dept, Madrid 28049, Spain
[2] Univ Autonoma Madrid, Theoret Phys Dept, Madrid 28049, Spain
[3] Inst Fis Teor CA, Madrid 28049, Spain
Keywords
Gaussian processes; Multi-class classification; Input-dependent noise; Variational inference; Bayesian inference; Regression
DOI
Not available
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
It is common practice in the machine learning community to assume that the observed data are noise-free in the input attributes. Nevertheless, scenarios with input noise are common in real problems, as measurements are never perfectly accurate. If this input noise is not taken into account, a supervised machine learning method is expected to perform sub-optimally. In this paper, we focus on multi-class classification problems and use Gaussian processes (GPs) as the underlying classifier. Motivated by a data set from the astrophysics domain, we hypothesize that the observed data may contain noise in the inputs. Therefore, we devise several multi-class GP classifiers that can account for input noise. Such classifiers can be efficiently trained using variational inference to approximate the posterior distribution of the latent variables of the model. Moreover, in some situations the amount of noise is known beforehand; if this is the case, it can be readily incorporated into the proposed methods, and this prior information is expected to lead to better performance. We have evaluated the proposed methods by carrying out several experiments involving synthetic and real data, including several data sets from the UCI repository, the MNIST data set, and a data set from astrophysics. The results show that, although the classification error is similar across methods, the predictive distribution of the proposed methods is better, in terms of test log-likelihood, than that of a GP classifier that ignores input noise.
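The abstract describes accounting for (possibly known) Gaussian input noise in multi-class GP classification. As a rough illustration of the idea only, and not the authors' variational-inference construction, the sketch below uses scikit-learn's GaussianProcessClassifier as a stand-in for the paper's GP model and approximates the predictive distribution under a known input-noise level by Monte Carlo averaging over perturbed test inputs. The names `input_noise_std` and `predict_proba_noisy` are illustrative assumptions, not from the paper.

```python
# Minimal sketch (not the paper's exact method): Monte Carlo averaging of a
# standard multi-class GP classifier's predictive probabilities over samples
# of the noisy inputs, assuming the input-noise standard deviation is known.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Synthetic 3-class problem; pretend the observed inputs were measured with
# additive Gaussian noise of known standard deviation `input_noise_std`.
X_clean, y = make_blobs(n_samples=300, centers=3, cluster_std=1.0, random_state=0)
input_noise_std = 0.3
X_noisy = X_clean + rng.normal(scale=input_noise_std, size=X_clean.shape)

# Standard multi-class GP classifier, trained on the noisy inputs as if exact.
gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0), random_state=0)
gpc.fit(X_noisy, y)

def predict_proba_noisy(model, X, noise_std, n_samples=50):
    """Average predictive probabilities over samples of the latent 'true' inputs:
    p(y*|x*) ~= (1/S) sum_s p(y*|x* + eps_s), eps_s ~ N(0, noise_std^2 I)."""
    probs = np.zeros((X.shape[0], len(model.classes_)))
    for _ in range(n_samples):
        X_s = X + rng.normal(scale=noise_std, size=X.shape)
        probs += model.predict_proba(X_s)
    return probs / n_samples

X_test = X_noisy[:5]
print("ignoring input noise:", gpc.predict_proba(X_test)[0])
print("averaging over noise:", predict_proba_noisy(gpc, X_test, input_noise_std)[0])
```

The paper itself instead treats the true inputs as latent variables and approximates their posterior jointly with the GP's via variational inference; the averaging above only mimics the effect of a known noise level on the predictive distribution.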
Pages: 52