EFFICIENT CALIBRATION FOR HIGH-DIMENSIONAL COMPUTER MODEL OUTPUT USING BASIS METHODS

Cited by: 8
Authors
Salter, James M. [1 ]
Williamson, Daniel B. [1 ,2 ]
Affiliations
[1] Univ Exeter, Dept Math, Exeter, Devon, England
[2] Alan Turing Inst, London, England
Funding
Engineering and Physical Sciences Research Council (EPSRC), UK
Keywords
uncertainty quantification; dimension reduction; history matching; emulation; basis rotation; BAYESIAN CALIBRATION; UNCERTAINTY; EMULATORS;
DOI
10.1615/Int.J.UncertaintyQuantification.2022042145
Chinese Library Classification
T [Industrial Technology]
Subject classification code
08
Abstract
Calibration of expensive computer models using emulators of high-dimensional output fields becomes increasingly intractable as the size of the field(s) being compared to observational data grows. In these settings, dimension reduction is attractive, reducing the number of emulators required to mimic the field(s) by orders of magnitude. By comparing to popular independent emulation approaches that fit univariate emulators to each grid cell of the output field, we demonstrate that using a basis structure for emulation, aside from the clear computational benefits, is essential for obtaining coherent draws that can be compared with data or used in prediction. We show that calibrating on the subspace spanned by the basis is not generally equivalent to calibrating on the full field (the latter being generally infeasible owing to the number and size of the matrix inversions required for calibration on the full field). We then present a projection that allows accurate calibration on the field for exactly the cost of calibrating in the subspace: we project in the norm induced by our uncertainties in the observations and the model discrepancy, requiring only a one-off inversion of a large matrix. We illustrate the benefits of our approach and compare with standard univariate approaches for emulating and calibrating the high-dimensional ice sheet model Glimmer.
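To illustrate the idea of projecting in the norm induced by the observation and discrepancy uncertainties, the following is a minimal NumPy sketch (not the authors' code) of a weighted basis projection, using toy data: a field z is projected onto a low-rank basis Gamma both in the standard L2 sense and in the norm induced by a variance matrix W, with W factorised once and reused, as the abstract's "one-off inversion" suggests. All names, sizes, and the toy data are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of basis projection in a weighted norm (assumed setup):
# z is a field of length ell, Gamma an ell x q basis, and W the combined
# variance of observation error and model discrepancy. Calibrating on the
# q coefficients from the W-weighted projection uses the same metric as
# comparing the full field with data.

rng = np.random.default_rng(1)
ell, q = 500, 5                                         # field size, basis size
Gamma = np.linalg.qr(rng.standard_normal((ell, q)))[0]  # toy orthonormal basis
z = rng.standard_normal(ell)                            # toy observed field
W = np.diag(rng.uniform(0.5, 2.0, ell))                 # toy variance matrix

# One-off factorisation of the large ell x ell matrix W, reused thereafter.
W_chol = np.linalg.cholesky(W)

def apply_Winv(v):
    """Return W^{-1} v using the stored Cholesky factor."""
    return np.linalg.solve(W_chol.T, np.linalg.solve(W_chol, v))

# Standard L2 projection: c_L2 = (Gamma^T Gamma)^{-1} Gamma^T z
c_L2 = np.linalg.solve(Gamma.T @ Gamma, Gamma.T @ z)

# Weighted projection: c_W = (Gamma^T W^{-1} Gamma)^{-1} Gamma^T W^{-1} z
WinvG = apply_Winv(Gamma)
c_W = np.linalg.solve(Gamma.T @ WinvG, WinvG.T @ z)

def w_norm_sq(v):
    """Squared norm of v induced by W, i.e. v^T W^{-1} v."""
    return float(v @ apply_Winv(v))

# The weighted projection never does worse than L2 in the W-induced norm.
print("W-norm error, L2 projection:", w_norm_sq(z - Gamma @ c_L2))
print("W-norm error, W  projection:", w_norm_sq(z - Gamma @ c_W))
```

In this sketch the W-weighted coefficients give a reconstruction error, measured in the W-induced norm, no larger than the standard projection's, which is the sense in which calibrating on the subspace can match calibrating on the field.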
Pages: 47-69
Number of pages: 23