Variational Learning on Aggregate Outputs with Gaussian Processes

Cited by: 0
Authors
Law, Ho Chung Leon [1]
Sejdinovic, Dino [1,6]
Cameron, Ewan [2]
Lucas, Tim C. D. [2]
Flaxman, Seth [3,4]
Battle, Katherine [2]
Fukumizu, Kenji [5]
Affiliations
[1] Univ Oxford, Dept Stat, Oxford, England
[2] Univ Oxford, Big Data Inst, Oxford, England
[3] Imperial Coll London, Dept Math, London, England
[4] Imperial Coll London, Data Sci Inst, London, England
[5] Inst Stat Math, Tachikawa, Tokyo, Japan
[6] Alan Turing Inst, London, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
PLASMODIUM-FALCIPARUM; AFRICA;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
While a typical supervised learning framework assumes that the inputs and the outputs are measured at the same levels of granularity, many applications, including global mapping of disease, only have access to outputs at a much coarser level than that of the inputs. Aggregation of outputs makes generalization to new inputs much more difficult. We consider an approach to this problem based on variational learning with a model of output aggregation and Gaussian processes, where aggregation leads to intractability of the standard evidence lower bounds. We propose new bounds and tractable approximations, leading to improved prediction accuracy and scalability to large datasets, while explicitly taking uncertainty into account. We develop a framework which extends to several types of likelihoods, including the Poisson model for aggregated count data. We apply our framework to a challenging and important problem, the fine-scale spatial modelling of malaria incidence, with over 1 million observations.
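The core setup the abstract describes can be sketched in a few lines: inputs are observed at a fine granularity, each carrying a latent GP-distributed log-intensity, but only aggregated Poisson counts per "bag" of inputs are observed. The sketch below (not the authors' code; function names such as `rbf_kernel` and `bag_log_likelihood` are illustrative assumptions) evaluates the bag-level Poisson likelihood given one GP prior draw:

```python
import numpy as np
from math import lgamma

def rbf_kernel(X, lengthscale=0.5, variance=1.0):
    """Squared-exponential covariance between all pairs of 1-D inputs."""
    d2 = (X[:, None] - X[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def bag_log_likelihood(y, f, bags):
    """Poisson log-likelihood of bag counts y given per-point log-intensities f.
    Each bag's rate is the sum of exp(f_i) over the fine-scale points it covers."""
    ll = 0.0
    for a, idx in enumerate(bags):
        rate = np.exp(f[idx]).sum()          # aggregated intensity for bag a
        ll += y[a] * np.log(rate) - rate - lgamma(y[a] + 1)
    return ll

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=12)               # fine-grained inputs
K = rbf_kernel(X) + 1e-8 * np.eye(12)        # jitter for numerical stability
f = rng.multivariate_normal(np.zeros(12), K) # one GP prior draw of log-intensity
bags = [np.arange(0, 4), np.arange(4, 8), np.arange(8, 12)]
y = np.array([rng.poisson(np.exp(f[idx]).sum()) for idx in bags])  # coarse outputs
print(bag_log_likelihood(y, f, bags))
```

In the paper's variational treatment, this likelihood is intractable under the standard evidence lower bound, motivating the new bounds and approximations; the sketch only shows the aggregation structure that causes the difficulty.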
Pages: 11