Learning GP-BayesFilters via Gaussian process latent variable models

Cited by: 0
Authors
Jonathan Ko
Dieter Fox
Institutions
[1] University of Washington, Department of Computer Science & Engineering
[2] Intel Corp., Intel Labs Seattle
Source
Autonomous Robots | 2011 / Vol. 30
Keywords
Gaussian process; System identification; Bayesian filtering; Time alignment; System control; Machine learning
DOI: not available
Abstract
GP-BayesFilters are a general framework for integrating Gaussian process prediction and observation models into Bayesian filtering techniques, including particle filters and extended and unscented Kalman filters. GP-BayesFilters have been shown to be extremely well suited for systems for which accurate parametric models are difficult to obtain. GP-BayesFilters learn non-parametric models from training data containing sequences of control inputs, observations, and ground truth states. The need for ground truth states limits the applicability of GP-BayesFilters to systems for which the ground truth can be estimated without significant overhead. In this paper we introduce GPBF-Learn, a framework for training GP-BayesFilters without ground truth states. Our approach extends Gaussian Process Latent Variable Models to the setting of dynamical robotics systems. We show how weak labels for the ground truth states can be incorporated into the GPBF-Learn framework. The approach is evaluated on a difficult tracking task, namely tracking a slotcar based on inertial measurement unit (IMU) observations only. We also show some special features enabled by this framework, including time alignment and control replay for both the slotcar and a robotic arm.
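To make the abstract's core idea concrete: a GP-BayesFilter replaces a hand-derived parametric transition model with a Gaussian process regressor trained on observed state transitions. The sketch below is not the authors' code; it is a minimal, generic GP regression learned on a toy 1-D dynamical system (the system `f`, the kernel hyperparameters, and the class name `GPModel` are all illustrative assumptions) to show what a learned, non-parametric prediction model looks like.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel between row-vector sets A (n,d) and B (m,d).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

class GPModel:
    """Vanilla GP regression, standing in for a learned transition model."""
    def __init__(self, X, y, noise=1e-2):
        self.X = X
        K = rbf_kernel(X, X) + noise * np.eye(len(X))
        self.L = np.linalg.cholesky(K)
        self.alpha = np.linalg.solve(self.L.T, np.linalg.solve(self.L, y))

    def predict(self, Xs):
        # Posterior mean and variance at query states Xs.
        Ks = rbf_kernel(Xs, self.X)
        mean = Ks @ self.alpha
        v = np.linalg.solve(self.L, Ks.T)
        var = rbf_kernel(Xs, Xs).diagonal() - (v**2).sum(0)
        return mean, var

# Train on transitions of a toy 1-D system x_{t+1} = f(x_t); in a real
# GP-BayesFilter the inputs would be (state, control) pairs and the
# targets the observed next states.
rng = np.random.default_rng(0)
xs = np.linspace(-3, 3, 40)
f = lambda x: 0.9 * x + 0.3 * np.sin(2 * x)
gp = GPModel(xs[:, None], f(xs) + 0.01 * rng.standard_normal(40))

mean, var = gp.predict(np.array([[1.0]]))
```

The predictive variance is what makes the GP model useful inside a Bayes filter: it feeds directly into the process-noise term of an (unscented) Kalman update, so poorly covered regions of the state space automatically yield more uncertain predictions.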
Pages: 3-23
Number of pages: 20
Related papers
50 results
  • [31] Shaking Hands in Latent Space: Modeling Emotional Interactions with Gaussian Process Latent Variable Models
    Taubert, Nick
    Endres, Dominik
    Christensen, Andrea
    Giese, Martin A.
    KI 2011: ADVANCES IN ARTIFICIAL INTELLIGENCE, 2011, 7006 : 330 - +
  • [32] Shared Gaussian Process Latent Variable Models for Handling Ambiguous Facial Expressions
    Ek, Carl Henrik
    Jaeckel, Peter
    Campbell, Neill
    Lawrence, Neil D.
    Melhuish, Chris
    INTELLIGENT SYSTEMS AND AUTOMATION, 2009, 1107 : 147 - +
  • [33] Distributed Variational Inference in Sparse Gaussian Process Regression and Latent Variable Models
    Gal, Yarin
    van der Wilk, Mark
    Rasmussen, Carl E.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 27 (NIPS 2014), 2014, 27
  • [34] Pseudo-marginal Bayesian inference for Gaussian process latent variable models
    Gadd, C.
    Wade, S.
    Shah, A. A.
    MACHINE LEARNING, 2021, 110 (06) : 1105 - 1143
  • [36] Deep Learning of Latent Variable Models for Industrial Process Monitoring
    Kong, Xiangyin
    Ge, Zhiqiang
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2022, 18 (10) : 6778 - 6788
  • [37] Generalised Gaussian Process Latent Variable Models (GPLVM) with Stochastic Variational Inference
    Lalchand, Vidhi
    Ravuri, Aditya
    Lawrence, Neil D.
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151, 2022, 151
  • [38] GP-ALPS: Automatic Latent Process Selection for Multi-Output Gaussian Process Models
    Berkovich, Pavel
    Perim, Eric
    Bruinsma, Wessel
    SYMPOSIUM ON ADVANCES IN APPROXIMATE BAYESIAN INFERENCE, VOL 118, 2019, 118
  • [39] Learning and scoring Gaussian latent variable causal models with unknown additive interventions
    Taeb, Armeen
    Gamella, Juan L.
    Heinze-Deml, Christina
    Bühlmann, Peter
    JOURNAL OF MACHINE LEARNING RESEARCH, 2024, 25
  • [40] CLASS-IMBALANCED CLASSIFIERS USING ENSEMBLES OF GAUSSIAN PROCESSES AND GAUSSIAN PROCESS LATENT VARIABLE MODELS
    Yang, Liu
    Heiselman, Cassandra
    Quirk, J. Gerald
    Djuric, Petar M.
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 3775 - 3779