GAR: Generalized Autoregression for Multi-Fidelity Fusion

Cited: 0
Authors
Wang, Yuxin [1 ]
Xing, Zheng [2 ]
Xing, Wei W. [3 ,4 ]
Affiliations
[1] Beihang Univ, Sch Math Sci, Beijing 100191, Peoples R China
[2] Rockchip Elect Co Ltd, Graph & Comp Dept, Fuzhou 350003, Peoples R China
[3] Univ Sheffield, Sch Math & Stat, Sheffield S10 2TN, England
[4] Beihang Univ, Sch Integrated Circuit Sci & Engn, Beijing 100191, Peoples R China
Keywords
DOI
(none)
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
In many scientific research and engineering applications where repeated simulations of complex systems are conducted, a surrogate is commonly adopted to quickly estimate the whole system. To reduce the high cost of generating training examples, combining the results of low-fidelity (fast but inaccurate) and high-fidelity (slow but accurate) simulations has become a promising approach. Despite the fast development of multi-fidelity fusion techniques, most existing methods require particular data structures and do not scale well to high-dimensional outputs. To resolve these issues, we generalize the classic autoregression (AR), which is widely used due to its simplicity, robustness, accuracy, and tractability, and propose generalized autoregression (GAR) using tensor formulation and latent features. GAR can deal with arbitrary dimensional outputs and arbitrary multi-fidelity data structures to satisfy the demands of multi-fidelity fusion for complex problems; it admits a fully tractable likelihood and posterior requiring no approximate inference and scales well to high-dimensional problems. Furthermore, we prove the autokrigeability theorem based on GAR in the multi-fidelity case and develop CIGAR, a simplified GAR that retains the exact predictive mean while reducing computation by a factor of d^3, where d is the dimensionality of the output. The empirical assessment includes many canonical PDEs and real scientific examples and demonstrates that the proposed method consistently outperforms the SOTA methods by a large margin (up to 6x improvement in RMSE) with only a couple of high-fidelity training samples.
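The classic autoregression that GAR generalizes is the two-fidelity scheme f_hi(x) = rho * f_lo(x) + delta(x), with Gaussian processes on the low-fidelity data and on the discrepancy delta. Below is a minimal NumPy sketch of that classic scheme, not of the paper's GAR/CIGAR implementation; the toy fidelity functions, the RBF lengthscale, and the least-squares estimator for rho are illustrative assumptions.

```python
import numpy as np

def rbf(X1, X2, ls=0.1, var=1.0):
    """Squared-exponential kernel between two input sets (n, d) x (m, d)."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return var * np.exp(-0.5 * d2 / ls**2)

def gp_predict(Xtr, ytr, Xte, jitter=1e-6):
    """Posterior mean of a zero-mean GP regression with an RBF kernel."""
    K = rbf(Xtr, Xtr) + jitter * np.eye(len(Xtr))
    return rbf(Xte, Xtr) @ np.linalg.solve(K, ytr)

# Toy fidelities (illustrative only): the cheap model is biased in both
# scale and trend relative to the expensive "truth".
f_lo = lambda x: np.sin(8.0 * x)
f_hi = lambda x: 1.2 * np.sin(8.0 * x) + 0.3 * x

X_lo = np.linspace(0.0, 1.0, 30)[:, None]  # many cheap samples
X_hi = np.linspace(0.0, 1.0, 6)[:, None]   # only a few expensive samples
y_lo, y_hi = f_lo(X_lo.ravel()), f_hi(X_hi.ravel())

# Step 1: fit a GP to low-fidelity data; evaluate it at high-fidelity inputs.
y_lo_at_hi = gp_predict(X_lo, y_lo, X_hi)

# Step 2: estimate the scale rho by least squares, then fit a second GP
# to the residual discrepancy delta = y_hi - rho * f_lo.
rho = (y_lo_at_hi @ y_hi) / (y_lo_at_hi @ y_lo_at_hi)
delta = y_hi - rho * y_lo_at_hi

# Fused prediction: rho * (low-fidelity GP) + (discrepancy GP).
X_te = np.linspace(0.0, 1.0, 50)[:, None]
pred = rho * gp_predict(X_lo, y_lo, X_te) + gp_predict(X_hi, delta, X_te)
rmse = np.sqrt(np.mean((pred - f_hi(X_te.ravel())) ** 2))
```

Even with six high-fidelity samples, the fused prediction tracks the expensive function closely because the cheap samples pin down its shape, which is the regime the abstract targets; GAR extends this scalar construction to arbitrary high-dimensional outputs and non-nested data.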
Pages: 15
Related papers (50 in total)
  • [1] A generalized hierarchical co-Kriging model for multi-fidelity data fusion
    Zhou, Qi
    Wu, Yuda
    Guo, Zhendong
    Hu, Jiexiang
    Jin, Peng
    STRUCTURAL AND MULTIDISCIPLINARY OPTIMIZATION, 2020, 62 (04) : 1885 - 1904
  • [2] Multi-fidelity information fusion based on prediction of kriging
    Dong, Huachao
    Song, Baowei
    Wang, Peng
    Huang, Shuai
    STRUCTURAL AND MULTIDISCIPLINARY OPTIMIZATION, 2015, 51 (06) : 1267 - 1280
  • [3] Multi-fidelity information fusion with concatenated neural networks
    Pawar, Suraj
    San, Omer
    Vedula, Prakash
    Rasheed, Adil
    Kvamsdal, Trond
    SCIENTIFIC REPORTS, 2022, 12 (01)
  • [4] A combined modeling method for complex multi-fidelity data fusion
    Tang, Lei
    Liu, Feng
    Wu, Anping
    Li, Yubo
    Jiang, Wanqiu
    Wang, Qingfeng
    Huang, Jun
    MACHINE LEARNING-SCIENCE AND TECHNOLOGY, 2024, 5 (03)
  • [5] Multi-fidelity information fusion for turbulent transport modeling in magnetic fusion plasma
    Maeyama, Shinya
    Honda, Mitsuru
    Narita, Emi
    Toda, Shinichiro
    SCIENTIFIC REPORTS, 2024, 14 (01)
  • [6] A multi-fidelity transfer learning strategy based on multi-channel fusion
    Zhang, Zihan
    Ye, Qian
    Yang, Dejin
    Wang, Na
    Meng, Guoxiang
    JOURNAL OF COMPUTATIONAL PHYSICS, 2024, 506
  • [7] Multi-fidelity optimization method with Asynchronous Generalized Island Model for AutoML
    Jurado, Israel Campero
    Vanschoren, Joaquin
    PROCEEDINGS OF THE 2022 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE COMPANION, GECCO 2022, 2022 : 220 - 223