Kriging hyperparameter tuning strategies

Cited by: 128
Authors
Toal, David J. J. [1 ]
Bressloff, Neil W. [1 ]
Keane, Andy J. [1 ]
Affiliations
[1] Univ Southampton, Sch Engn Sci, Southampton SO17 1BJ, Hants, England
DOI
10.2514/1.34822
Chinese Library Classification
V [Aeronautics, Astronautics];
Discipline Codes
08 ; 0825 ;
Abstract
Response surfaces have been extensively used as a method of building effective surrogate models of high-fidelity computational simulations. Of the numerous types of response surface models, kriging is perhaps one of the most effective, due to its ability to model complicated responses through interpolation or regression of known data while providing an estimate of the error in its prediction. There is, however, little information indicating the extent to which the hyperparameters of a kriging model need to be tuned for the resulting surrogate model to be effective. This paper addresses the issue by investigating how often, and how thoroughly, the hyperparameters of a kriging model need to be tuned as the model is updated during an optimization process. To this end, an optimization benchmarking procedure is introduced and used to assess the performance of five different tuning strategies over a range of problem sizes. The results of this benchmark demonstrate the performance gains that can be associated with reducing the complexity of the hyperparameter tuning process for complicated design problems. The strategy of tuning hyperparameters only once, after the initial design of experiments, is shown to perform poorly.
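The tuning the abstract refers to is typically maximization of the kriging likelihood with respect to the correlation hyperparameters. A minimal NumPy sketch of this idea is shown below, assuming a Gaussian correlation function and ordinary kriging (constant mean); the concentrated log-likelihood and the crude grid search are illustrative stand-ins, not the paper's actual tuning strategies.

```python
import numpy as np

def corr_matrix(X, theta):
    # Gaussian correlation: R_ij = exp(-sum_k theta_k * (x_ik - x_jk)^2)
    d2 = (X[:, None, :] - X[None, :, :]) ** 2
    return np.exp(-(d2 * theta).sum(axis=-1))

def neg_log_likelihood(theta, X, y):
    # Concentrated negative log-likelihood for ordinary kriging:
    # the mean mu and process variance sigma2 are profiled out analytically,
    # leaving a function of the correlation hyperparameters theta only.
    n = len(y)
    R = corr_matrix(X, theta) + 1e-10 * np.eye(n)  # jitter for conditioning
    L = np.linalg.cholesky(R)
    ones = np.ones(n)
    Ri_y = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ri_1 = np.linalg.solve(L.T, np.linalg.solve(L, ones))
    mu = (ones @ Ri_y) / (ones @ Ri_1)          # generalized least-squares mean
    r = y - mu
    Ri_r = np.linalg.solve(L.T, np.linalg.solve(L, r))
    sigma2 = (r @ Ri_r) / n                     # profiled process variance
    log_det = 2.0 * np.log(np.diag(L)).sum()    # log|R| from the Cholesky factor
    return 0.5 * (n * np.log(sigma2) + log_det)

# "Tuning" here is a coarse grid search over an isotropic theta; the paper
# benchmarks far more capable strategies (e.g. evolutionary optimizers).
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(20, 2))
y = np.sin(6.0 * X[:, 0]) + 0.5 * X[:, 1]
best = min((neg_log_likelihood(np.full(2, t), X, y), t)
           for t in (0.1, 1.0, 10.0, 100.0))
print("best theta:", best[1])
```

Re-running this search every time new sample points are appended to `X` and `y` corresponds to the "tune at every update" end of the spectrum the paper benchmarks; skipping it corresponds to tuning only after the initial design of experiments.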
Pages: 1240 - 1252
Page count: 13
Related Papers
50 records
  • [1] The development of a hybridized particle swarm for kriging hyperparameter tuning
    Toal, D. J. J.
    Bressloff, N. W.
    Keane, A. J.
    Holden, C. M. E.
    ENGINEERING OPTIMIZATION, 2011, 43 (06) : 675 - 699
  • [2] Restart strategies enabling automatic differentiation for hyperparameter tuning in inverse problems
    Davy, Leo
    Pustelnik, Nelly
    Abry, Patrice
    32ND EUROPEAN SIGNAL PROCESSING CONFERENCE, EUSIPCO 2024, 2024, : 1811 - 1815
  • [3] Elastic Hyperparameter Tuning on the Cloud
    Dunlap, Lisa
    Kandasamy, Kirthevasan
    Misra, Ujval
    Liaw, Richard
    Jordan, Michael
    Stoica, Ion
    Gonzalez, Joseph E.
    PROCEEDINGS OF THE 2021 ACM SYMPOSIUM ON CLOUD COMPUTING (SOCC '21), 2021, : 33 - 46
  • [4] Hyperparameter Tuning of ConvLSTM Network Models
    Vrskova, Roberta
    Sykora, Peter
    Kamencay, Patrik
    Hudec, Robert
    Radil, Roman
    2021 44TH INTERNATIONAL CONFERENCE ON TELECOMMUNICATIONS AND SIGNAL PROCESSING (TSP), 2021, : 15 - 18
  • [5] Game AI Hyperparameter Tuning in Rinascimento
    Bravi, Ivan
    Volz, Vanessa
    Lucas, Simon
    PROCEEDINGS OF THE 2019 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE COMPANION (GECCO'19 COMPANION), 2019, : 1742 - 1746
  • [6] Hyperparameter Tuning in Echo State Networks
    Matzner, Filip
    PROCEEDINGS OF THE 2022 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE (GECCO'22), 2022, : 404 - 412
  • [7] Refining the ONCE Benchmark With Hyperparameter Tuning
    Golyadkin, Maksim
    Gambashidze, Alexander
    Nurgaliev, Ildar
    Makarov, Ilya
    IEEE ACCESS, 2024, 12 : 3805 - 3814
  • [8] On the Performance of Differential Evolution for Hyperparameter Tuning
    Schmidt, Mischa
    Safarani, Shand
    Gastinger, Julia
    Jacobs, Tobias
    Nicolas, Sebastien
    Schuelke, Anett
    2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2019,
  • [9] On hyperparameter tuning in general clustering problems
    Fan, Xinjie
    Yue, Yuguang
    Sarkar, Purnamrita
    Wang, Y. X. Rachel
    25TH AMERICAS CONFERENCE ON INFORMATION SYSTEMS (AMCIS 2019), 2019,
  • [10] On hyperparameter tuning in general clustering problems
    Fan, Xinjie
    Yue, Yuguang
    Sarkar, Purnamrita
    Wang, Y. X. Rachel
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119