Variational Learning on Aggregate Outputs with Gaussian Processes

Cited by: 0
Authors
Law, Ho Chung Leon [1 ]
Sejdinovic, Dino [1 ,6 ]
Cameron, Ewan [2 ]
Lucas, Tim C. D. [2 ]
Flaxman, Seth [3 ,4 ]
Battle, Katherine [2 ]
Fukumizu, Kenji [5 ]
Affiliations
[1] Univ Oxford, Dept Stat, Oxford, England
[2] Univ Oxford, Big Data Inst, Oxford, England
[3] Imperial Coll London, Dept Math, London, England
[4] Imperial Coll London, Data Sci Inst, London, England
[5] Inst Stat Math, Tachikawa, Tokyo, Japan
[6] Alan Turing Inst, London, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC)
Keywords
PLASMODIUM-FALCIPARUM; AFRICA;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
While a typical supervised learning framework assumes that the inputs and the outputs are measured at the same levels of granularity, many applications, including global mapping of disease, only have access to outputs at a much coarser level than that of the inputs. Aggregation of outputs makes generalization to new inputs much more difficult. We consider an approach to this problem based on variational learning with a model of output aggregation and Gaussian processes, where aggregation leads to intractability of the standard evidence lower bounds. We propose new bounds and tractable approximations, leading to improved prediction accuracy and scalability to large datasets, while explicitly taking uncertainty into account. We develop a framework which extends to several types of likelihoods, including the Poisson model for aggregated count data. We apply our framework to a challenging and important problem, the fine-scale spatial modelling of malaria incidence, with over 1 million observations.
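To make the aggregation setup in the abstract concrete, the following is a minimal illustrative sketch, not the authors' implementation: it assumes an RBF kernel, a mean-field Gaussian variational posterior over pixel-level latents, a log-link Poisson rate with a population offset, and a Monte Carlo estimate of the intractable expected log-likelihood in the ELBO. All variable names and toy data are invented for illustration.

```python
# Hedged sketch of the aggregated-count Poisson/GP model: pixel-level GP latents
# are exponentiated into rates, summed within each region, and compared against
# region-level Poisson counts via a Monte Carlo ELBO. Not the paper's code.
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(0)

# Toy data: 200 pixels with 2-D covariates, grouped into 10 regions.
n_pix, n_reg = 200, 10
X = rng.uniform(size=(n_pix, 2))                 # pixel covariates
region = rng.integers(0, n_reg, size=n_pix)      # pixel -> region assignment
pop = rng.uniform(50, 500, size=n_pix)           # per-pixel population offset
y = rng.poisson(5.0, size=n_reg)                 # observed region-level counts

def rbf(A, B, ls=0.3, var=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / ls ** 2)

K = rbf(X, X) + 1e-6 * np.eye(n_pix)             # GP prior covariance over pixels

def elbo(m, log_s, n_samples=64):
    """Monte Carlo ELBO: E_q[log p(y | f)] - KL(q(f) || p(f))."""
    s = np.exp(log_s)
    # Sample pixel-level latents f ~ q(f) = N(m, diag(s^2)).
    eps = rng.standard_normal((n_samples, n_pix))
    f = m + s * eps
    lam_pix = pop * np.exp(f)                    # pixel rates, (n_samples, n_pix)
    # Aggregate pixel rates into region rates; counts are Poisson(sum of rates).
    lam_reg = np.zeros((n_samples, n_reg))
    for a in range(n_reg):
        lam_reg[:, a] = lam_pix[:, region == a].sum(axis=1)
    loglik = (y * np.log(lam_reg + 1e-12) - lam_reg
              - gammaln(y + 1)).sum(axis=1).mean()
    # KL between the factorised Gaussian q(f) and the GP prior N(0, K).
    Kinv = np.linalg.inv(K)
    kl = 0.5 * (np.trace(Kinv @ np.diag(s ** 2)) + m @ Kinv @ m
                - n_pix + np.linalg.slogdet(K)[1] - 2 * log_s.sum())
    return loglik - kl

print(elbo(np.zeros(n_pix), np.log(0.1) * np.ones(n_pix)))
```

In practice the variational parameters would be optimised with gradients, and the paper's tractable bounds and sparse (inducing-point) formulation replace the naive full-covariance and Monte Carlo steps shown here for scalability to millions of pixels.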
Pages: 11