Kriging hyperparameter tuning strategies

Cited by: 128
Authors
Toal, David J. J. [1 ]
Bressloff, Neil W. [1 ]
Keane, Andy J. [1 ]
Affiliation
[1] Univ Southampton, Sch Engn Sci, Southampton SO17 1BJ, Hants, England
DOI: 10.2514/1.34822
Chinese Library Classification: V [Aeronautics, Astronautics]
Discipline classification code: 08; 0825
Abstract
Response surfaces have been used extensively to build effective surrogate models of high-fidelity computational simulations. Of the numerous types of response surface model, kriging is perhaps one of the most effective, owing to its ability to model complicated responses through interpolation or regression of known data while providing an estimate of the error in its prediction. There is, however, little information indicating how extensively the hyperparameters of a kriging model must be tuned for the resulting surrogate model to be effective. This paper addresses that issue by investigating how often, and to what level of accuracy, the hyperparameters of a kriging model need to be tuned as the model is updated during an optimization process. To this end, an optimization benchmarking procedure is introduced and used to assess the performance of five different tuning strategies over a range of problem sizes. The results of this benchmark demonstrate the performance gains that can be associated with reducing the complexity of the hyperparameter tuning process for complicated design problems. The strategy of tuning the hyperparameters only once, after the initial design of experiments, is shown to perform poorly.
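The tuning the abstract refers to is typically maximum-likelihood estimation of the kriging correlation hyperparameters. As a minimal, hypothetical sketch (not the authors' implementation), assuming a one-dimensional ordinary-kriging model with a Gaussian correlation function exp(-θ·d²) and a single hyperparameter θ selected by grid search over the concentrated log-likelihood:

```python
import numpy as np

def neg_concentrated_loglik(theta, X, y):
    # Negative concentrated log-likelihood of a 1-D ordinary-kriging model:
    # the mean mu and process variance sigma^2 are profiled out analytically,
    # leaving only the correlation hyperparameter theta to tune.
    n = len(y)
    d = X[:, None] - X[None, :]
    R = np.exp(-theta * d ** 2) + 1e-8 * np.eye(n)  # Gaussian correlation + nugget
    try:
        L = np.linalg.cholesky(R)
    except np.linalg.LinAlgError:
        return np.inf  # R ill-conditioned for this theta: reject it
    solve = lambda b: np.linalg.solve(L.T, np.linalg.solve(L, b))  # R^-1 b via Cholesky
    ones = np.ones(n)
    mu = (ones @ solve(y)) / (ones @ solve(ones))   # profiled mean
    resid = y - mu
    sigma2 = (resid @ solve(resid)) / n             # profiled process variance
    log_det_R = 2.0 * np.log(np.diag(L)).sum()
    return 0.5 * (n * np.log(sigma2) + log_det_R)

def tune_theta(X, y, grid=None):
    # Tune theta by minimising the negative log-likelihood over a log-spaced grid;
    # in practice a gradient-based or evolutionary optimiser would be used instead.
    if grid is None:
        grid = np.logspace(-2, 2, 200)
    vals = [neg_concentrated_loglik(t, X, y) for t in grid]
    return grid[int(np.argmin(vals))]

# Toy data: 12 samples of a smooth test function
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0.0, 1.0, 12))
y = np.sin(6.0 * X)
theta_hat = tune_theta(X, y)
```

Each likelihood evaluation requires an O(n³) Cholesky factorisation and the likelihood surface is generally multimodal, which is the tuning cost that the strategies compared in the paper trade off against surrogate accuracy.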
Pages: 1240-1252 (13 pages)
Related papers (50 in total)
  • [31] Everyone's a Winner! On Hyperparameter Tuning of Recommendation Models
    Shehzad, Faisal
    Jannach, Dietmar
    PROCEEDINGS OF THE 17TH ACM CONFERENCE ON RECOMMENDER SYSTEMS, RECSYS 2023, 2023, : 652 - 657
  • [32] A stochastic optimization technique for hyperparameter tuning in reservoir computing
    Mwamsojo, Nickson
    Lehmann, Frederic
    Merghem, Kamel
    Frignac, Yann
    Benkelfat, Badr-Eddine
    NEUROCOMPUTING, 2024, 574
  • [33] Population-Based Hyperparameter Tuning With Multitask Collaboration
    Li, Wendi
    Wang, Ting
    Ng, Wing W. Y.
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (09) : 5719 - 5731
  • [34] Fast Efficient Hyperparameter Tuning for Policy Gradient Methods
    Paul, Supratik
    Kurin, Vitaly
    Whiteson, Shimon
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [35] Hyperparameter Tuning over an Attention Model for Image Captioning
    Castro, Roberto
    Pineda, Israel
    Eugenio Morocho-Cayamcela, Manuel
    INFORMATION AND COMMUNICATION TECHNOLOGIES (TICEC 2021), 2021, 1456 : 172 - 183
  • [36] Weighted Sampling for Combined Model Selection and Hyperparameter Tuning
    Sarigiannis, Dimitrios
    Parnell, Thomas P.
    Pozidis, Haralampos
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 5595 - 5603
  • [37] Hyperparameter Tuning for Big Data using Bayesian Optimisation
    Joy, Tinu Theckel
    Rana, Santu
    Gupta, Sunil
    Venkatesh, Svetha
    2016 23RD INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2016, : 2574 - 2579
  • [38] Fast Model Selection and Hyperparameter Tuning for Generative Models
    Chen, Luming
    Ghosh, Sujit K.
    ENTROPY, 2024, 26 (02)
  • [39] Hyperparameter Tuning for Medicare Fraud Detection in Big Data
    Hancock, J. T.
    Khoshgoftaar, T. M.
    SN COMPUTER SCIENCE, 3 (6)
  • [40] Algorithms for Hyperparameter Tuning of LSTMs for Time Series Forecasting
    Dhake, Harshal
    Kashyap, Yashwant
    Kosmopoulos, Panagiotis
    REMOTE SENSING, 2023, 15 (08)