Learning GP-BayesFilters via Gaussian process latent variable models

Cited by: 0
Authors
Jonathan Ko
Dieter Fox
Affiliations
[1] University of Washington,Department of Computer Science & Engineering
[2] Intel Corp.,Intel Labs Seattle
Source
Autonomous Robots | 2011, Vol. 30
Keywords
Gaussian process; System identification; Bayesian filtering; Time alignment; System control; Machine learning;
DOI: not available
Abstract
GP-BayesFilters are a general framework for integrating Gaussian process prediction and observation models into Bayesian filtering techniques, including particle filters and extended and unscented Kalman filters. GP-BayesFilters have been shown to be extremely well suited for systems for which accurate parametric models are difficult to obtain. GP-BayesFilters learn non-parametric models from training data containing sequences of control inputs, observations, and ground truth states. The need for ground truth states limits the applicability of GP-BayesFilters to systems for which the ground truth can be estimated without significant overhead. In this paper we introduce GPBF-Learn, a framework for training GP-BayesFilters without ground truth states. Our approach extends Gaussian Process Latent Variable Models to the setting of dynamical robotics systems. We show how weak labels for the ground truth states can be incorporated into the GPBF-Learn framework. The approach is evaluated on a difficult tracking task, namely tracking a slotcar based on inertial measurement unit (IMU) observations only. We also demonstrate special features enabled by this framework, including time alignment and control replay for both the slotcar and a robotic arm.
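For intuition, the core idea the abstract refers to — learning a non-parametric prediction model by Gaussian process regression on labeled (state, control, next-state) data, whose posterior mean can then serve as the transition model inside a Bayes filter — can be sketched as below. This is an illustrative toy in Python/NumPy, not the authors' implementation; the `GPTransitionModel` class, the 1-D dynamics, and all hyperparameters are invented for the example, and i.i.d. transitions stand in for real trajectory data.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

class GPTransitionModel:
    """GP prediction model for p(x_t | x_{t-1}, u_{t-1}), trained on
    (previous state, control, next state) triples."""
    def __init__(self, X_prev, U, X_next, noise=1e-2):
        self.Z = np.hstack([X_prev, U])           # GP inputs: state + control
        self.dX = X_next - X_prev                 # regression targets: state change
        K = rbf_kernel(self.Z, self.Z) + noise * np.eye(len(self.Z))
        self.alpha = np.linalg.solve(K, self.dX)  # precompute K^{-1} y

    def predict(self, x, u):
        """Posterior-mean prediction of the next state."""
        z = np.hstack([x, u])[None, :]
        return x + (rbf_kernel(z, self.Z) @ self.alpha).ravel()

# Toy 1-D system x_t = x_{t-1} + 0.5 * u_{t-1}, with "ground truth"
# states available at training time, as a standard GP-BayesFilter assumes.
rng = np.random.default_rng(0)
X_prev = rng.uniform(-1, 1, size=(50, 1))
U = rng.uniform(-1, 1, size=(50, 1))
X_next = X_prev + 0.5 * U

gp = GPTransitionModel(X_prev, U, X_next)
pred = gp.predict(np.array([0.2]), np.array([0.4]))  # true next state: 0.4
```

The contribution of this paper is precisely to remove the need for the `X_prev`/`X_next` labels above: GPBF-Learn treats the states as latent variables, in the style of a Gaussian Process Latent Variable Model, and optimizes them jointly with the GP models.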
Pages: 3-23 (20 pages)