Variational Hierarchical Mixtures for Probabilistic Learning of Inverse Dynamics

Cited by: 0
Authors
Abdulsamad, Hany [1 ]
Nickl, Peter [2 ]
Klink, Pascal [3 ]
Peters, Jan [3 ]
Affiliations
[1] Aalto Univ, Dept Elect Engn & Automat, Espoo 02150, Finland
[2] RIKEN Ctr Adv Intelligence Project, Chuo City 1030027, Japan
[3] Tech Univ Darmstadt, Dept Comp Sci, D-64289 Darmstadt, Germany
Funding
EU Horizon 2020;
Keywords
Bayes methods; Data models; Computational modeling; Uncertainty; Mixture models; Manipulator dynamics; Neural networks; Dirichlet process mixtures; generative models; hierarchical local regression; inverse dynamics control; SAMPLING METHODS; INFERENCE; NETWORKS; MODELS;
DOI
10.1109/TPAMI.2023.3314670
CLC Number
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Well-calibrated probabilistic regression models are a crucial learning component in robotics applications as datasets grow rapidly and tasks become more complex. Unfortunately, classical regression models are usually either probabilistic kernel machines with a flexible structure that does not scale gracefully with data or deterministic and vastly scalable automata, albeit with a restrictive parametric form and poor regularization. In this paper, we consider a probabilistic hierarchical modeling paradigm that combines the benefits of both worlds to deliver computationally efficient representations with inherent complexity regularization. The presented approaches are probabilistic interpretations of local regression techniques that approximate nonlinear functions through a set of local linear or polynomial units. Importantly, we rely on principles from Bayesian nonparametrics to formulate flexible models that adapt their complexity to the data and can potentially encompass an infinite number of components. We derive two efficient variational inference techniques to learn these representations and highlight the advantages of hierarchical infinite local regression models, such as dealing with non-smooth functions, mitigating catastrophic forgetting, and enabling parameter sharing and fast predictions. Finally, we validate this approach on large inverse dynamics datasets and test the learned models in real-world control scenarios.
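The abstract describes approximating a nonlinear function with a set of local linear units whose predictions are combined by input-dependent gating. A minimal, hypothetical sketch of this local-regression idea (fixed Gaussian gates, per-unit weighted least squares) is given below; it is an illustration under simplifying assumptions, not the paper's Bayesian nonparametric model or its variational inference scheme.

```python
import numpy as np

# Hypothetical illustration: approximate y = sin(x) with K local linear
# units. Gates and bandwidth are fixed by hand here; in the paper these
# quantities are inferred, and the number of units adapts to the data.
rng = np.random.default_rng(0)
x = np.linspace(-3.0, 3.0, 200)
y = np.sin(x) + 0.05 * rng.standard_normal(x.size)

K = 10                                   # number of local units (assumed)
centers = np.linspace(-3.0, 3.0, K)      # fixed unit locations (assumed)
bandwidth = 0.5                          # fixed gate width (assumed)

# Gaussian gating weights, normalized across units for each input
w = np.exp(-0.5 * ((x[:, None] - centers[None, :]) / bandwidth) ** 2)
w /= w.sum(axis=1, keepdims=True)

# Fit one linear model per unit by weighted least squares:
# solve (X^T W_k X) beta_k = X^T W_k y for each unit k
X = np.stack([np.ones_like(x), x], axis=1)
preds = np.zeros((x.size, K))
for k in range(K):
    wk = w[:, k]
    beta = np.linalg.solve(X.T @ (wk[:, None] * X), X.T @ (wk * y))
    preds[:, k] = X @ beta

# Prediction is the gate-weighted combination of the local linear fits
y_hat = (w * preds).sum(axis=1)
rmse = float(np.sqrt(np.mean((y_hat - y) ** 2)))
```

The soft gating makes the combined prediction smooth even though each unit is linear; increasing the number of units trades bias for variance, which is the complexity trade-off the paper regularizes through its Bayesian nonparametric prior.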
Pages: 1950-1963
Page count: 14