Latent Time Neural Ordinary Differential Equations

Cited by: 0
Authors
Anumasa, Srinivas [1 ]
Srijith, P. K. [1 ]
Affiliations
[1] Indian Inst Technol Hyderabad, Hyderabad, Telangana, India
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Neural ordinary differential equations (NODEs) have been proposed as a continuous-depth generalization of popular deep learning models such as residual networks (ResNets). They provide parameter efficiency and, to some extent, automate the model selection process in deep learning. However, they lack the uncertainty modelling and robustness capabilities that are crucial for their use in real-world applications such as autonomous driving and healthcare. We propose a novel approach to model uncertainty in NODEs by placing a distribution over the end-time T of the ODE solver. The proposed approach, latent time NODE (LT-NODE), treats T as a latent variable and applies Bayesian learning to obtain a posterior distribution over T from the data. In particular, we use variational inference to learn an approximate posterior along with the model parameters. Prediction is performed by considering the NODE representations at end-times sampled from the posterior, and can be carried out efficiently using a single forward pass. As T implicitly defines the depth of a NODE, the posterior distribution over T also helps with model selection in NODEs. We further propose adaptive latent time NODE (ALT-NODE), which allows each data point to have a distinct posterior distribution over end-times; ALT-NODE uses amortized variational inference with inference networks to learn the approximate posteriors. We demonstrate the effectiveness of the proposed approaches in modelling uncertainty and robustness through experiments on synthetic data and several real-world image classification datasets.
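To make the single-forward-pass prediction scheme described in the abstract concrete, the following is a minimal, illustrative sketch of LT-NODE-style Monte-Carlo prediction. It assumes a log-normal approximate posterior q(T) and a fixed-step Euler solver; the names (ODEFunc, euler_integrate, lt_node_predict, q_mu, q_log_sigma) and the network architecture are hypothetical and not taken from the paper. The point is that one ODE solve up to the largest sampled end-time yields the representations at every sampled end-time, whose class probabilities are then averaged.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ODEFunc(nn.Module):
    """dz/dt = f(z, t); a small MLP (illustrative architecture)."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 64), nn.Tanh(), nn.Linear(64, dim))

    def forward(self, t, z):
        return self.net(z)

def euler_integrate(func, z0, t_grid):
    """Fixed-step Euler solver; returns the state at every grid point."""
    zs, z = [z0], z0
    for i in range(len(t_grid) - 1):
        dt = t_grid[i + 1] - t_grid[i]
        z = z + dt * func(t_grid[i], z)
        zs.append(z)
    return torch.stack(zs)  # shape: (len(t_grid), batch, dim)

@torch.no_grad()
def lt_node_predict(func, classifier, x, q_mu, q_log_sigma,
                    n_samples=10, n_steps=50):
    """Monte-Carlo prediction under q(T): sample end-times from a
    log-normal approximate posterior, read the trajectory off at each
    sampled T from a single ODE solve, and average class probabilities."""
    T_samples = torch.exp(q_mu + torch.exp(q_log_sigma) * torch.randn(n_samples))
    t_grid = torch.linspace(0.0, float(T_samples.max()), n_steps)
    traj = euler_integrate(func, x, t_grid)          # one forward pass
    probs = 0.0
    for T in T_samples:
        idx = torch.argmin(torch.abs(t_grid - T))    # state closest to sampled T
        probs = probs + F.softmax(classifier(traj[idx]), dim=-1)
    return probs / n_samples
```

As a usage sketch, with func = ODEFunc(dim), classifier = nn.Linear(dim, n_classes), and scalar q_mu, q_log_sigma learned by maximizing an ELBO (not shown here), lt_node_predict(func, classifier, x, q_mu, q_log_sigma) returns averaged class probabilities for a batch x. The amortized ALT-NODE variant would instead produce per-example (q_mu, q_log_sigma) from an inference network applied to x.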
Pages: 6010 - 6018
Page count: 9
Related Papers
50 records in total
  • [1] Learning quantum dynamics with latent neural ordinary differential equations
    Choi, Matthew
    Flam-Shepherd, Daniel
    Kyaw, Thi Ha
    Aspuru-Guzik, Alan
    PHYSICAL REVIEW A, 2022, 105 (04)
  • [2] NEURAL ORDINARY DIFFERENTIAL EQUATIONS FOR TIME SERIES RECONSTRUCTION
    Androsov, D. V.
    RADIO ELECTRONICS COMPUTER SCIENCE CONTROL, 2023, (04) : 69 - 75
  • [3] Neural Ordinary Differential Equations
    Chen, Ricky T. Q.
    Rubanova, Yulia
    Bettencourt, Jesse
    Duvenaud, David
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [4] Optical neural ordinary differential equations
    Zhao, Yun
    Chen, Hang
    Lin, Min
    Zhang, Haiou
    Yan, Tao
    Huang, Ruqi
    Lin, Xing
    Dai, Qionghai
    OPTICS LETTERS, 2023, 48 (03) : 628 - 631
  • [5] Whole-heart electromechanical simulations using Latent Neural Ordinary Differential Equations
    Salvador, Matteo
    Strocchi, Marina
    Regazzoni, Francesco
    Augustin, Christoph M.
    Dede', Luca
    Niederer, Steven A.
    Quarteroni, Alfio
    NPJ DIGITAL MEDICINE, 2024, 7 (01)
  • [6] Stiff neural ordinary differential equations
    Kim, Suyong
    Ji, Weiqi
    Deng, Sili
    Ma, Yingbo
    Rackauckas, Christopher
    CHAOS, 2021, 31 (09)
  • [7] Neural Manifold Ordinary Differential Equations
    Lou, Aaron
    Lim, Derek
    Katsman, Isay
    Huang, Leo
    Jiang, Qingxuan
    Lim, Ser-Nam
    De Sa, Christopher
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [8] Derivation and analysis of parallel-in-time neural ordinary differential equations
    Lorin, E.
    ANNALS OF MATHEMATICS AND ARTIFICIAL INTELLIGENCE, 2020, 88 (10) : 1035 - 1059
  • [9] LFT: Neural Ordinary Differential Equations With Learnable Final-Time
    Pang, Dong
    Le, Xinyi
    Guan, Xinping
    Wang, Jun
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (05) : 6918 - 6927