Altering Gaussian process to Student-t process for maximum distribution construction

Cited by: 4
Authors
Wang, Weidong [1 ]
Yu, Qin [2 ]
Fasli, Maria [3 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Informat & Software Engn, Chengdu 610054, Peoples R China
[2] Univ Elect Sci & Technol China, Sch Informat & Commun Engn, Chengdu, Peoples R China
[3] Univ Essex, Sch Comp Sci & Elect Engn, Inst Data Analyt & Sci, Colchester, Essex, England
Funding
National Natural Science Foundation of China;
Keywords
Gaussian process regression; Student-t process regression; Maximum distribution; Sequential Monte Carlo; Bayesian optimisation; PROCESS REGRESSION; OPTIMIZATION;
DOI
10.1080/00207721.2020.1838663
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline code
0812;
Abstract
Gaussian process (GP) regression is widely used to find the extremum of a black-box function by iteratively refining an approximation of the objective as new evaluations are obtained. Such evaluations are usually chosen by optimising a given acquisition function. However, in non-parametric Bayesian optimisation, the extremum of the objective function is not a deterministic value but a random variable with a distribution. We call this distribution the maximum distribution; it is generally non-analytical. Constructing the maximum distribution with the traditional GP regression approach, by optimising an acquisition function, is computationally costly, since the GP model scales cubically in the number of training data. Moreover, the acquisition function introduces extra hyperparameters that make the optimisation more complicated. Recently, inspired by the Sequential Monte Carlo method and its application in Bayesian optimisation, a Monte Carlo-like method was proposed to approximate the maximum distribution with weighted samples. As an alternative to the GP-based method, we construct the maximum distribution within the framework of the Student-t process (TP), which accounts for more of the uncertainty in the training data. Toy examples and a real-data experiment show that the TP-based Monte Carlo maximum distribution performs competitively with the GP-based method.
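The core idea from the abstract — that under a stochastic-process surrogate the maximum of the objective is itself a random variable — can be illustrated with a minimal sketch. This is a hypothetical toy, not the paper's algorithm: it uses a 1-D objective, an RBF kernel, and plain posterior sampling on a grid (rather than the paper's Sequential Monte Carlo weighting), recording each posterior sample path's maximum to form an empirical maximum distribution. The Student-t variant is mimicked by rescaling the Gaussian draws with an inverse-chi-squared factor, which is one standard construction of multivariate-t samples.

```python
import numpy as np

def rbf(a, b, ls=0.3):
    """Squared-exponential kernel on 1-D inputs (illustrative choice)."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

rng = np.random.default_rng(0)
f = lambda x: np.sin(3 * x) + 0.5 * x          # toy black-box objective
X = np.array([0.1, 0.4, 0.9, 1.5])             # observed inputs
y = f(X)                                       # noise-free observations

grid = np.linspace(0.0, 2.0, 200)
K = rbf(X, X) + 1e-8 * np.eye(len(X))          # jitter for stability
Ks = rbf(grid, X)
Kss = rbf(grid, grid)

Kinv = np.linalg.inv(K)
mu = Ks @ Kinv @ y                             # GP posterior mean on the grid
cov = Kss - Ks @ Kinv @ Ks.T                   # GP posterior covariance
cov += 1e-8 * np.eye(len(grid))

# Each posterior sample path is one plausible objective; its maximum is one
# sample from the (non-analytical) maximum distribution.
n = 500
samples = rng.multivariate_normal(mu, cov, size=n, check_valid="ignore")
max_samples = samples.max(axis=1)              # empirical GP maximum distribution

# Heavier-tailed TP-style draws: scale zero-mean Gaussian deviations by
# sqrt(nu / chi2_nu), a standard multivariate-t construction (nu is assumed).
nu = 5.0
w = nu / rng.chisquare(nu, size=n)
dev = rng.multivariate_normal(np.zeros(len(grid)), cov, size=n,
                              check_valid="ignore")
tp_max = (mu + np.sqrt(w)[:, None] * dev).max(axis=1)

print("GP max distribution: mean %.3f, std %.3f" %
      (max_samples.mean(), max_samples.std()))
print("TP max distribution: mean %.3f, std %.3f" %
      (tp_max.mean(), tp_max.std()))
```

Since the best observed value is about 1.13, both empirical maximum distributions concentrate above it; the TP draws typically spread wider, reflecting the extra tail uncertainty the abstract attributes to the Student-t process.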
Pages: 727-755 (29 pages)