Learning GP-BayesFilters via Gaussian process latent variable models

Authors
Jonathan Ko
Dieter Fox
Affiliations
[1] University of Washington, Department of Computer Science & Engineering
[2] Intel Corp., Intel Labs Seattle
Source
Autonomous Robots | 2011, Vol. 30
Keywords
Gaussian process; System identification; Bayesian filtering; Time alignment; System control; Machine learning
DOI
Not available
Abstract
GP-BayesFilters are a general framework for integrating Gaussian process prediction and observation models into Bayesian filtering techniques, including particle filters and extended and unscented Kalman filters. GP-BayesFilters have been shown to be extremely well suited for systems for which accurate parametric models are difficult to obtain. GP-BayesFilters learn non-parametric models from training data containing sequences of control inputs, observations, and ground truth states. The need for ground truth states limits the applicability of GP-BayesFilters to systems for which the ground truth can be estimated without significant overhead. In this paper we introduce GPBF-Learn, a framework for training GP-BayesFilters without ground truth states. Our approach extends Gaussian process latent variable models to the setting of dynamical robotics systems. We show how weak labels for the ground truth states can be incorporated into the GPBF-Learn framework. The approach is evaluated on a difficult tracking task, namely tracking a slotcar based on inertial measurement unit (IMU) observations only. We also demonstrate special features enabled by this framework, including time alignment and control replay for both the slotcar and a robotic arm.
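As a rough illustration of the GP-BayesFilter idea summarized above, the Python sketch below trains one Gaussian process as a prediction model (mapping state and control to state change) and another as an observation model (mapping state to observation), then performs a single mean-propagation step. The one-dimensional state, the synthetic training sequence, and the squared-exponential kernel with fixed hyperparameters are illustrative assumptions, not the authors' implementation; in particular, GPBF-Learn additionally recovers the latent states themselves via a GPLVM instead of requiring ground truth, which this sketch does not attempt.

```python
# Minimal sketch, assuming a 1-D state, synthetic data, and fixed GP
# hyperparameters: a GP prediction model maps (state, control) -> state change,
# and a GP observation model maps state -> observation.
import numpy as np

def sq_exp_kernel(A, B, length=1.0, signal=1.0):
    """Squared-exponential kernel between the row vectors of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return signal**2 * np.exp(-0.5 * d2 / length**2)

class GPModel:
    """Standard GP regression: posterior mean and variance at test inputs."""
    def __init__(self, X, y, noise=0.1):
        self.X, self.y = X, y
        K = sq_exp_kernel(X, X) + noise**2 * np.eye(len(X))
        self.K_inv = np.linalg.inv(K)

    def predict(self, Xs):
        Ks = sq_exp_kernel(Xs, self.X)
        mean = Ks @ self.K_inv @ self.y
        var = sq_exp_kernel(Xs, Xs).diagonal() - np.einsum(
            'ij,jk,ik->i', Ks, self.K_inv, Ks)
        return mean, np.maximum(var, 1e-9)

# Hypothetical training data: states x_t, controls u_t, observations z_t.
rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(size=50))           # ground-truth states (1-D)
u = rng.normal(size=50)                      # control inputs
z = np.sin(x) + 0.05 * rng.normal(size=50)   # observations of the state

# Prediction model: (x_t, u_t) -> x_{t+1} - x_t.  Observation model: x_t -> z_t.
pred_gp = GPModel(np.column_stack([x[:-1], u[:-1]]), x[1:] - x[:-1])
obs_gp = GPModel(x[:, None], z)

# One filter step, mean propagation only (a full GP-BayesFilter would also
# feed the GP variances into an EKF/UKF/particle-filter update).
dx_mean, dx_var = pred_gp.predict(np.array([[x[-1], u[-1]]]))
z_mean, z_var = obs_gp.predict((x[-1] + dx_mean)[:, None])
print("predicted state change:", dx_mean, "expected observation:", z_mean)
```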
Pages: 3-23
Page count: 20