Variational Hierarchical Mixtures for Probabilistic Learning of Inverse Dynamics

Cited by: 0
Authors
Abdulsamad, Hany [1 ]
Nickl, Peter [2 ]
Klink, Pascal [3 ]
Peters, Jan [3 ]
Affiliations
[1] Aalto Univ, Dept Elect Engn & Automat, Espoo 02150, Finland
[2] RIKEN Ctr Adv Intelligence Project, Chuo City 1030027, Japan
[3] Tech Univ Darmstadt, Dept Comp Sci, D-64289 Darmstadt, Germany
Funding
EU Horizon 2020
Keywords
Bayes methods; Data models; Computational modeling; Uncertainty; Mixture models; Manipulator dynamics; Neural networks; Dirichlet process mixtures; generative models; hierarchical local regression; inverse dynamics control; SAMPLING METHODS; INFERENCE; NETWORKS; MODELS;
DOI
10.1109/TPAMI.2023.3314670
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Well-calibrated probabilistic regression models are a crucial learning component in robotics applications as datasets grow rapidly and tasks become more complex. Unfortunately, classical regression models are usually either probabilistic kernel machines with a flexible structure that does not scale gracefully with data or deterministic and vastly scalable automata, albeit with a restrictive parametric form and poor regularization. In this paper, we consider a probabilistic hierarchical modeling paradigm that combines the benefits of both worlds to deliver computationally efficient representations with inherent complexity regularization. The presented approaches are probabilistic interpretations of local regression techniques that approximate nonlinear functions through a set of local linear or polynomial units. Importantly, we rely on principles from Bayesian nonparametrics to formulate flexible models that adapt their complexity to the data and can potentially encompass an infinite number of components. We derive two efficient variational inference techniques to learn these representations and highlight the advantages of hierarchical infinite local regression models, such as dealing with non-smooth functions, mitigating catastrophic forgetting, and enabling parameter sharing and fast predictions. Finally, we validate this approach on large inverse dynamics datasets and test the learned models in real-world control scenarios.
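The abstract describes approximating nonlinear functions through a set of local linear units combined in a probabilistic mixture. As a rough illustration of that idea only — not the paper's method — the sketch below fits a mixture of local linear experts with Gaussian input gates using plain EM on a toy 1-D problem; the paper instead uses Bayesian nonparametric (Dirichlet process) priors and variational inference, and the fixed expert count `K`, the toy data, and all variable names here are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a nonlinear function to be approximated by local linear units.
x = np.linspace(-3.0, 3.0, 200)
y = np.sin(x) + 0.1 * rng.standard_normal(x.size)

K = 5                                   # fixed expert count (a DP mixture would infer this)
mu = np.linspace(x.min(), x.max(), K)   # expert input centers
s2 = np.full(K, 1.0)                    # expert input variances
w = np.zeros((K, 2))                    # per-expert [slope, intercept]
pi = np.full(K, 1.0 / K)                # mixture weights

X = np.stack([x, np.ones_like(x)], axis=1)  # design matrix with bias column


def responsibilities(pi, mu, s2):
    """Posterior probability of each expert per input (Gaussian gates)."""
    logp = (np.log(pi)[:, None]
            - 0.5 * np.log(2 * np.pi * s2)[:, None]
            - 0.5 * (x[None, :] - mu[:, None]) ** 2 / s2[:, None])
    r = np.exp(logp - logp.max(axis=0))
    return r / r.sum(axis=0)


for _ in range(50):
    r = responsibilities(pi, mu, s2)          # E-step
    for k in range(K):                        # M-step: weighted local fits
        W = r[k]
        A = X.T @ (W[:, None] * X) + 1e-6 * np.eye(2)
        w[k] = np.linalg.solve(A, X.T @ (W * y))
        Nk = W.sum()
        mu[k] = (W @ x) / Nk
        s2[k] = (W @ (x - mu[k]) ** 2) / Nk + 1e-6
        pi[k] = Nk / x.size

# Prediction: responsibility-weighted blend of the local linear predictions.
r = responsibilities(pi, mu, s2)
pred = (r * (X @ w.T).T).sum(axis=0)
rmse = np.sqrt(np.mean((pred - y) ** 2))
```

Replacing this fixed-K EM scheme with stick-breaking variational updates is what lets the paper's models grow their complexity with the data rather than commit to a parametric form up front.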
Pages: 1950-1963 (14 pages)