Scalable Computation of Predictive Probabilities in Probit Models with Gaussian Process Priors

Cited by: 2
Authors
Cao, Jian [1 ]
Durante, Daniele [2 ,3 ]
Genton, Marc G. [1 ]
Affiliations
[1] King Abdullah Univ Sci & Technol, Stat Program, Thuwal, Saudi Arabia
[2] Bocconi Univ, Dept Decis Sci, Milan, Italy
[3] Bocconi Univ, Bocconi Inst Data Sci & Analyt, Milan, Italy
Keywords
Binary data; Gaussian process; Multivariate truncated normal; Probit model; Unified skew-normal; Variational Bayes; Conditioning approximations; Bayesian inference; Regression; Binary; Classification; Simulation
DOI
10.1080/10618600.2022.2036614
Chinese Library Classification
O21 [Probability theory and mathematical statistics]; C8 [Statistics]
Subject classification codes
020208; 070103; 0714
Abstract
Predictive models for binary data are fundamental in various fields, and the growing complexity of modern applications has motivated several flexible specifications for modeling the relationship between the observed predictors and the binary responses. A widely-implemented solution is to express the probability parameter via a probit mapping of a Gaussian process indexed by predictors. However, unlike for continuous settings, there is a lack of closed-form results for predictive distributions in binary models with Gaussian process priors. Markov chain Monte Carlo methods and approximation strategies provide common solutions to this problem, but state-of-the-art algorithms are either computationally intractable or inaccurate in moderate-to-high dimensions. In this article, we aim to cover this gap by deriving closed-form expressions for the predictive probabilities in probit Gaussian processes that rely either on cumulative distribution functions of multivariate Gaussians or on functionals of multivariate truncated normals. To evaluate these quantities we develop novel scalable solutions based on tile-low-rank Monte Carlo methods for computing multivariate Gaussian probabilities, and on mean-field variational approximations of multivariate truncated normals. Closed-form expressions for the marginal likelihood and for the posterior distribution of the Gaussian process are also discussed. As shown in simulated and real-world empirical studies, the proposed methods scale to dimensions where state-of-the-art solutions are impractical.
Pages: 709-720
Number of pages: 12
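As a rough illustration of the closed-form expressions described in the abstract, the sketch below evaluates the predictive probability of a probit Gaussian process as a ratio of multivariate Gaussian cumulative distribution functions. This is not the authors' implementation: it uses SciPy's generic quasi-Monte Carlo CDF routine rather than the tile-low-rank Monte Carlo methods proposed in the article, and the squared-exponential kernel, the hyperparameters, and the helper names `rbf_kernel` and `probit_gp_predictive_prob`, as well as the synthetic data, are illustrative assumptions.

```python
# Minimal sketch (not the authors' tile-low-rank implementation): the predictive
# probability pr(y_new = 1 | y) in a probit Gaussian process equals a ratio of an
# (n+1)-dimensional to an n-dimensional Gaussian orthant probability.
import numpy as np
from scipy.stats import multivariate_normal


def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between the rows of X1 and X2 (assumed kernel)."""
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2.0 * X1 @ X2.T
    return variance * np.exp(-0.5 * np.maximum(d2, 0.0) / lengthscale**2)


def probit_gp_predictive_prob(X, y, x_new, lengthscale=1.0, variance=1.0):
    """pr(y_new = 1 | y) = Phi_{n+1}(0; 0, D+ K+ D+ + I) / Phi_n(0; 0, D K D + I),
    with D = diag(2*y - 1) and '+' denoting the inputs augmented with x_new."""
    n = len(y)
    X_plus = np.vstack([X, x_new[None, :]])
    K = rbf_kernel(X, X, lengthscale, variance)
    K_plus = rbf_kernel(X_plus, X_plus, lengthscale, variance)
    D = np.diag(2.0 * y - 1.0)
    D_plus = np.diag(np.append(2.0 * y - 1.0, 1.0))   # sign +1 for the event y_new = 1
    num_cov = D_plus @ K_plus @ D_plus + np.eye(n + 1)
    den_cov = D @ K @ D + np.eye(n)
    # Orthant probabilities Phi_k(0; 0, Sigma) = P(Z <= 0), Z ~ N_k(0, Sigma),
    # evaluated with SciPy's quasi-Monte Carlo CDF (noisy and slow in high dimensions).
    num = multivariate_normal.cdf(np.zeros(n + 1), mean=np.zeros(n + 1), cov=num_cov)
    den = multivariate_normal.cdf(np.zeros(n), mean=np.zeros(n), cov=den_cov)
    return num / den


# Toy usage with synthetic data (hypothetical example, n = 20 binary observations).
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))
f = rng.multivariate_normal(np.zeros(20), rbf_kernel(X, X) + 1e-8 * np.eye(20))
y = (f + rng.normal(size=20) > 0).astype(float)
print(probit_gp_predictive_prob(X, y, np.array([0.1, -0.3])))
```

For the moderate-to-high dimensions targeted by the article, this generic CDF routine becomes impractical, which is exactly the regime the proposed tile-low-rank Monte Carlo and mean-field variational strategies are designed to handle.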
Related Papers (50 in total; items 41-50 shown)
• [41] Risk-Sensitive Model Predictive Control with Gaussian Process Models. Yang, Xiaoke; Maciejowski, Jan. IFAC-PapersOnLine, 2015, 48(28): 374-379.
• [42] Divide & conquer identification using Gaussian process priors. Leith, D. J.; Leithead, W. E.; Solak, E.; Murray-Smith, R. Proceedings of the 41st IEEE Conference on Decision and Control, Vols 1-4, 2002: 624-629.
• [43] Scalable Gaussian Process Variational Autoencoders. Jazbec, Metod; Ashman, Matthew; Fortuin, Vincent; Pearce, Michael; Mandt, Stephan; Raetsch, Gunnar. 24th International Conference on Artificial Intelligence and Statistics (AISTATS), 2021, 130.
• [44] Lower bounds for posterior rates with Gaussian process priors. Castillo, Ismael. Electronic Journal of Statistics, 2008, 2: 1281-1299.
• [45] Eliciting Gaussian process priors for complex computer codes. Oakley, J. Journal of the Royal Statistical Society, Series D (The Statistician), 2002, 51: 81-97.
• [46] Scalable Variational Gaussian Process Classification. Hensman, James; Matthews, Alex G. de G.; Ghahramani, Zoubin. Artificial Intelligence and Statistics (AISTATS), 2015, 38: 351-360.
• [47] Scalable Gaussian Process for Extreme Classification. Dhaka, Akash Kumar; Andersen, Michael Riis; Moreno, Pablo Garcia; Vehtari, Aki. Proceedings of the 2020 IEEE 30th International Workshop on Machine Learning for Signal Processing (MLSP), 2020.
• [48] Physiological Gaussian process priors for the hemodynamics in fMRI analysis. Wilzen, Josef; Eklund, Anders; Villani, Mattias. Journal of Neuroscience Methods, 2020, 342.
• [49] Gaussian Process Priors for View-Aware Inference. Hou, Yuxin; Heljakka, Ari; Solin, Arno. Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21), 2021, 35: 7762-7770.
• [50] Scalable Gaussian Process Regression Networks. Li, Shibo; Xing, Wei; Kirby, Robert M.; Zhe, Shandian. Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence (IJCAI), 2020: 2456-2462.