Surrogate approach to uncertainty quantification of neural networks for regression

Cited by: 4
Authors
Kang, Myeonginn [1]
Kang, Seokho [1]
Affiliations
[1] Sungkyunkwan Univ, Dept Ind Engn, 2066 Seobu Ro, Suwon 16419, South Korea
Funding
National Research Foundation of Singapore
Keywords
Neural network; Uncertainty quantification; Regression; Sensitivity analysis; Surrogate analysis
DOI
10.1016/j.asoc.2023.110234
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Uncertainty quantification is essential for preventing inaccurate predictions of neural networks. A vanilla neural network for regression does not intrinsically provide explicit information about prediction uncertainty. To quantify the prediction uncertainty for regression problems, we can build an alternative prediction model specialized for uncertainty quantification. However, this requires the use of training data, which are inaccessible in many real-world situations. To address such situations, this study presents a surrogate approach that quantifies the prediction uncertainty of a regression network without using training data. A regression network tends to have high prediction uncertainty when its output is sensitive to its input. Based on this intuition, we quantify the sensitivity and use it as a surrogate measure of the prediction uncertainty. To do so, we introduce four surrogate measures that capture the sensitivity in different ways: Input perturbation, Gradient norm, MC-dropout, and Knowledge distillation. For a query instance, each surrogate measure can be calculated using only the regression network, thereby providing an estimate of the prediction uncertainty. We demonstrate the effectiveness of each of the proposed surrogate measures on nine regression datasets. © 2023 Elsevier B.V. All rights reserved.
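The abstract describes four sensitivity-based surrogate measures (Input perturbation, Gradient norm, MC-dropout, and Knowledge distillation) that are computed from the trained regression network alone. The short PyTorch sketch below illustrates the general idea for two of them, input perturbation and gradient norm. It is a minimal illustration rather than the authors' implementation; the toy network and the values of noise_std and n_samples are assumptions chosen only for the example.

import torch

def input_perturbation_uncertainty(model, x, n_samples=30, noise_std=0.05):
    # Standard deviation of predictions under small Gaussian input perturbations;
    # a less stable output is read as higher prediction uncertainty.
    model.eval()
    with torch.no_grad():
        preds = torch.stack([model(x + noise_std * torch.randn_like(x))
                             for _ in range(n_samples)])
    return preds.std().item()

def gradient_norm_uncertainty(model, x):
    # L2 norm of the gradient of the prediction with respect to the input;
    # a larger gradient means the output reacts more strongly to input changes.
    model.eval()
    x = x.clone().detach().requires_grad_(True)
    y = model(x)
    grad = torch.autograd.grad(y.sum(), x)[0]
    return grad.norm().item()

# Toy usage: a small untrained network and a single query instance.
torch.manual_seed(0)
model = torch.nn.Sequential(torch.nn.Linear(8, 32), torch.nn.ReLU(), torch.nn.Linear(32, 1))
x_query = torch.randn(1, 8)
print(input_perturbation_uncertainty(model, x_query))
print(gradient_norm_uncertainty(model, x_query))

Both scores are obtained per query instance without any access to the training data, which is the setting the surrogate approach targets.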
Pages: 9
Related articles (50 in total)
  • [1] Estimation of an improved surrogate model in uncertainty quantification by neural networks
    Goetz, Benedict
    Kersting, Sebastian
    Kohler, Michael
    ANNALS OF THE INSTITUTE OF STATISTICAL MATHEMATICS, 2021, 73 (02) : 249 - 281
  • [2] Density regression and uncertainty quantification with Bayesian deep noise neural networks
    Zhang, Daiwei
    Liu, Tianci
    Kang, Jian
    STAT, 2023, 12 (01)
  • [3] Neural networks based surrogate modeling for efficient uncertainty quantification and calibration of MEMS accelerometers
    Zacchei, Filippo
    Rizzini, Francesco
    Gattere, Gabriele
    Frangi, Attilio
    Manzoni, Andrea
    INTERNATIONAL JOURNAL OF NON-LINEAR MECHANICS, 2025, 167
  • [4] Development and analysis of surrogate hybrid functionals using neural networks, uncertainty quantification, and visualization
    Medford, Andrew
    Lei, Xiangyun
    Chau, Polo
    Hohman, Fred
    ABSTRACTS OF PAPERS OF THE AMERICAN CHEMICAL SOCIETY, 2018, 255
  • [5] Uncertainty quantification in multivariable regression for material property prediction with Bayesian neural networks
    Li, Longze
    Chang, Jiang
    Vakanski, Aleksandar
    Wang, Yachun
    Yao, Tiankai
    Xian, Min
    SCIENTIFIC REPORTS, 2024, 14 (01)
  • [6] Analytically tractable heteroscedastic uncertainty quantification in Bayesian neural networks for regression tasks
    Deka, Bhargob
    Nguyen, Luong Ha
    Goulet, James-A.
    NEUROCOMPUTING, 2024, 572
  • [7] Uncertainty quantification in regression neural networks using evidential likelihood-based inference
    Denoeux, Thierry
    INTERNATIONAL JOURNAL OF APPROXIMATE REASONING, 2025, 182
  • [8] Uncertainty Quantification in Regression Neural Networks Using Likelihood-Based Belief Functions
    Denoeux, Thierry
    BELIEF FUNCTIONS: THEORY AND APPLICATIONS, BELIEF 2024, 2024, 14909 : 40 - 48
  • [9] Uncertainty quantification for predictions of atomistic neural networks
    Vazquez-Salazar, Luis Itza
    Boittier, Eric D.
    Meuwly, Markus
    CHEMICAL SCIENCE, 2022, 13 (44) : 13068 - 13084