Distributed Variational Inference in Sparse Gaussian Process Regression and Latent Variable Models

Cited by: 0
Authors
Gal, Yarin [1]
van der Wilk, Mark [1]
Rasmussen, Carl E. [1]
Affiliation
[1] Univ Cambridge, Cambridge, England
Keywords
DOI
None available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Gaussian processes (GPs) are a powerful tool for probabilistic inference over functions. They have been applied to both regression and non-linear dimensionality reduction, and offer desirable properties such as uncertainty estimates, robustness to over-fitting, and principled ways for tuning hyper-parameters. However, the scalability of these models to big datasets remains an active topic of research. We introduce a novel re-parametrisation of variational inference for sparse GP regression and latent variable models that allows for an efficient distributed algorithm. This is done by exploiting the decoupling of the data given the inducing points to re-formulate the evidence lower bound in a Map-Reduce setting. We show that the inference scales well with data and computational resources, while preserving a balanced distribution of the load among the nodes. We further demonstrate the utility in scaling Gaussian processes to big data. We show that GP performance improves with increasing amounts of data in regression (on flight data with 2 million records) and latent variable modelling (on MNIST). The results show that GPs perform better than many common models often used for big data.
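The key idea in the abstract — that, given the inducing points, the evidence lower bound depends on the data only through sums over data points, so it can be computed in a Map-Reduce fashion — can be sketched as follows. This is an illustrative sketch using a collapsed Titsias-style bound with a squared-exponential kernel, not the paper's own code; the function and variable names are assumptions.

```python
import numpy as np

def rbf(A, B, ell=1.0, sf2=1.0):
    # Squared-exponential kernel matrix between row sets A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return sf2 * np.exp(-0.5 * d2 / ell ** 2)

def map_stats(Xc, yc, Z):
    # "Map" step: a node sees only its chunk (Xc, yc) and, given the
    # inducing inputs Z, returns partial sufficient statistics that are
    # plain sums over its data points.
    Kzi = rbf(Z, Xc)              # m x n_chunk cross-covariances
    return (Kzi @ Kzi.T,          # sum_i k(Z, x_i) k(x_i, Z)
            Kzi @ yc,             # sum_i k(Z, x_i) y_i
            rbf(Xc, Xc).trace(),  # sum_i k(x_i, x_i)
            yc @ yc,              # sum_i y_i^2
            len(yc))              # chunk size

def reduce_bound(parts, Z, sigma2=0.1):
    # "Reduce" step: add the per-chunk statistics, then evaluate the
    # collapsed evidence lower bound from the summed statistics alone.
    Phi, psi, tr_Knn, yy, n = (sum(p[k] for p in parts) for k in range(5))
    m = Z.shape[0]
    Kzz = rbf(Z, Z) + 1e-8 * np.eye(m)  # jitter for numerical stability
    A = Kzz + Phi / sigma2
    quad = yy / sigma2 - psi @ np.linalg.solve(A, psi) / sigma2 ** 2
    logdet = (n * np.log(sigma2)
              + np.linalg.slogdet(A)[1] - np.linalg.slogdet(Kzz)[1])
    trace_term = (tr_Knn - np.trace(np.linalg.solve(Kzz, Phi))) / sigma2
    return -0.5 * (n * np.log(2 * np.pi) + logdet + quad + trace_term)
```

Because the statistics are additive, splitting the data across any number of nodes yields exactly the same bound as a single-node computation, which is what makes the balanced distributed evaluation possible.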
Pages: 9
Related Papers (50 in total)
  • [1] A Distributed Variational Inference Framework for Unifying Parallel Sparse Gaussian Process Regression Models
    Hoang, Trong Nghia; Hoang, Quang Minh; Low, Bryan Kian Hsiang
    International Conference on Machine Learning, Vol 48, 2016
  • [2] Variational Inference for Sparse Spectrum Gaussian Process Regression
    Tan, Linda S. L.; Ong, Victor M. H.; Nott, David J.; Jasra, Ajay
    Statistics and Computing, 2016, 26 (06): 1243-1261
  • [3] Generalised Gaussian Process Latent Variable Models (GPLVM) with Stochastic Variational Inference
    Lalchand, Vidhi; Ravuri, Aditya; Lawrence, Neil D.
    International Conference on Artificial Intelligence and Statistics, Vol 151, 2022
  • [4] Sparse Variational Inference for Generalized Gaussian Process Models
    Sheth, Rishit; Wang, Yuyang; Khardon, Roni
    International Conference on Machine Learning, Vol 37, 2015: 1302-1311
  • [5] Stochastic Variational Inference for Bayesian Sparse Gaussian Process Regression
    Yu, Haibin; Hoang, Trong Nghia; Low, Bryan Kian Hsiang; Jaillet, Patrick
    2019 International Joint Conference on Neural Networks (IJCNN), 2019
  • [6] Fully Bayesian Inference for Latent Variable Gaussian Process Models
    Yerramilli, Suraj; Iyer, Akshay; Chen, Wei; Apley, Daniel W.
    SIAM/ASA Journal on Uncertainty Quantification, 2023, 11 (04): 1357-1381
  • [7] A Unifying Framework of Anytime Sparse Gaussian Process Regression Models with Stochastic Variational Inference for Big Data
    Hoang, Trong Nghia; Hoang, Quang Minh; Low, Kian Hsiang
    International Conference on Machine Learning, Vol 37, 2015: 569-578
  • [8] Convergence of Sparse Variational Inference in Gaussian Processes Regression
    Burt, David R.; Rasmussen, Carl Edward; van der Wilk, Mark
    Journal of Machine Learning Research, 2020, 21