Scalable Computation of Predictive Probabilities in Probit Models with Gaussian Process Priors

Cited by: 2
Authors
Cao, Jian [1 ]
Durante, Daniele [2 ,3 ]
Genton, Marc G. [1 ]
Affiliations
[1] King Abdullah Univ Sci & Technol, Stat Program, Thuwal, Saudi Arabia
[2] Bocconi Univ, Dept Decis Sci, Milan, Italy
[3] Bocconi Univ, Bocconi Inst Data Sci & Analyt, Milan, Italy
Keywords
Binary data; Gaussian process; Multivariate truncated normal; Probit model; Unified skew-normal; Variational Bayes; Conditioning approximations; Bayesian inference; Regression; Binary; Classification; Simulation
DOI
10.1080/10618600.2022.2036614
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
Predictive models for binary data are fundamental in various fields, and the growing complexity of modern applications has motivated several flexible specifications for modeling the relationship between the observed predictors and the binary responses. A widely-implemented solution is to express the probability parameter via a probit mapping of a Gaussian process indexed by predictors. However, unlike for continuous settings, there is a lack of closed-form results for predictive distributions in binary models with Gaussian process priors. Markov chain Monte Carlo methods and approximation strategies provide common solutions to this problem, but state-of-the-art algorithms are either computationally intractable or inaccurate in moderate-to-high dimensions. In this article, we aim to cover this gap by deriving closed-form expressions for the predictive probabilities in probit Gaussian processes that rely either on cumulative distribution functions of multivariate Gaussians or on functionals of multivariate truncated normals. To evaluate these quantities we develop novel scalable solutions based on tile-low-rank Monte Carlo methods for computing multivariate Gaussian probabilities, and on mean-field variational approximations of multivariate truncated normals. Closed-form expressions for the marginal likelihood and for the posterior distribution of the Gaussian process are also discussed. As shown in simulated and real-world empirical studies, the proposed methods scale to dimensions where state-of-the-art solutions are impractical.
Pages: 709-720
Page count: 12
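
Illustrative sketch (not from the article). The abstract describes closed-form predictive probabilities expressed through cumulative distribution functions of multivariate Gaussians. As a rough, small-n illustration of that route, the Python sketch below uses the standard latent-variable representation of the probit link, y_i = 1{f(x_i) + eps_i > 0} with f ~ GP(0, k) and eps_i ~ N(0, 1), under which Pr(y_new = 1 | y) becomes a ratio Phi_{n+1}(0; Omega_{n+1}) / Phi_n(0; Omega_n). The squared-exponential kernel, its hyperparameters, and the use of scipy.stats.multivariate_normal.cdf (a generic quasi-Monte Carlo routine, feasible only for small n) are assumptions made for illustration, not the paper's tile-low-rank Monte Carlo implementation.

# Sketch: predictive probability Pr(y_new = 1 | y) in a probit Gaussian process,
# written as a ratio of multivariate Gaussian cdfs under the latent representation
#   y_i = 1{ f(x_i) + eps_i > 0 },  f ~ GP(0, k),  eps_i ~ N(0, 1).
# Kernel choice and hyperparameters are illustrative assumptions.
import numpy as np
from scipy.stats import multivariate_normal


def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between row-wise point sets A and B.
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * sq / lengthscale**2)


def probit_gp_predictive_prob(X, y, x_new, lengthscale=1.0, variance=1.0):
    # Pr(y_new = 1 | y) = Phi_{n+1}(0; Omega_{n+1}) / Phi_n(0; Omega_n),
    # where Omega = S (K + I) S and S flips the sign of each latent coordinate
    # with y_i = 0; the new point keeps sign +1 because we target Pr(y_new = 1, y).
    n = len(y)
    X_all = np.vstack([X, x_new[None, :]])
    K_all = rbf_kernel(X_all, X_all, lengthscale, variance) + np.eye(n + 1)
    s_all = np.append(2.0 * y - 1.0, 1.0)
    Omega_all = s_all[:, None] * K_all * s_all[None, :]
    Omega_obs = Omega_all[:n, :n]
    # Pr(s * z > 0) with z ~ N(0, K + I) equals Phi(0; Omega) by symmetry of the
    # zero-mean Gaussian. SciPy's cdf is a small-n stand-in for the paper's
    # tile-low-rank Monte Carlo solver.
    num = multivariate_normal(mean=np.zeros(n + 1), cov=Omega_all).cdf(np.zeros(n + 1))
    den = multivariate_normal(mean=np.zeros(n), cov=Omega_obs).cdf(np.zeros(n))
    return num / den


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(8, 2))          # small n keeps the generic cdf routine feasible
    y = (X[:, 0] + 0.3 * rng.normal(size=8) > 0).astype(float)
    print(probit_gp_predictive_prob(X, y, x_new=np.array([0.5, -0.2])))

For the moderate-to-high dimensions targeted by the article, the generic quasi-Monte Carlo cdf call above would be replaced by the tile-low-rank Monte Carlo methods, or by the mean-field variational approximations of multivariate truncated normals, described in the abstract.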