Kriging hyperparameter tuning strategies

Cited by: 128
Authors: Toal, David J. J. [1]; Bressloff, Neil W. [1]; Keane, Andy J. [1]
Affiliation: [1] University of Southampton, School of Engineering Sciences, Southampton SO17 1BJ, Hampshire, England
DOI: 10.2514/1.34822
Chinese Library Classification: V [Aeronautics, Astronautics]
Discipline codes: 08; 0825
Abstract
Response surfaces have been extensively used as a method of building effective surrogate models of high-fidelity computational simulations. Of the numerous types of response surface models, kriging is perhaps one of the most effective, due to its ability to model complicated responses through interpolation or regression of known data while providing an estimate of the error in its prediction. There is, however, little information indicating the extent to which the hyperparameters of a kriging model need to be tuned for the resulting surrogate model to be effective. This paper addresses that issue by investigating how frequently, and to what level of convergence, the hyperparameters of a kriging model must be tuned as the model is updated during an optimization process. To this end, an optimization benchmarking procedure is introduced and used to assess the performance of five different tuning strategies over a range of problem sizes. The results of this benchmark demonstrate the performance gains achievable by reducing the complexity of the hyperparameter tuning process for complicated design problems. The strategy of tuning the hyperparameters only once, after the initial design of experiments, is shown to perform poorly.
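
To make the contrast between tuning strategies concrete, here is a minimal sketch (not the authors' benchmark code) of two of the strategies the abstract describes: tuning the hyperparameters only once after the initial design of experiments versus retuning them after every model update. It assumes scikit-learn's GaussianProcessRegressor as the kriging model; the one-dimensional test function f, the sample sizes, and the random infill points are illustrative stand-ins for an expensive simulation and a real infill criterion.

```python
# Sketch: "tune once" vs. "retune every update" kriging strategies.
# Assumes scikit-learn is available; f, sample sizes, and random infill
# points are hypothetical stand-ins for an expensive simulation.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def f(x):
    # 1-D test function standing in for a high-fidelity simulation
    return np.sin(3 * x) + 0.5 * x

rng = np.random.default_rng(0)
X = rng.uniform(0, 4, size=(8, 1))   # initial design of experiments
y = f(X).ravel()

kernel = ConstantKernel(1.0) * RBF(length_scale=1.0)

# Strategy A: tune the hyperparameters once on the initial DoE, then freeze.
gp_once = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
frozen = GaussianProcessRegressor(kernel=gp_once.kernel_, optimizer=None,
                                  normalize_y=True)

# Strategy B: re-maximize the likelihood after every update.
gp_retune = GaussianProcessRegressor(kernel=kernel, normalize_y=True,
                                     n_restarts_optimizer=5)

for _ in range(10):                  # sequential infill/update loop
    x_new = rng.uniform(0, 4, size=(1, 1))
    X = np.vstack([X, x_new])
    y = np.append(y, f(x_new).ravel())
    frozen.fit(X, y)                 # refits the model, hyperparameters fixed
    gp_retune.fit(X, y)              # re-optimizes hyperparameters as well

Xt = np.linspace(0, 4, 200).reshape(-1, 1)
for name, gp in [("tune once", frozen), ("retune each update", gp_retune)]:
    mu = gp.predict(Xt)
    print(name, "RMSE:", np.sqrt(np.mean((mu - f(Xt).ravel()) ** 2)))
```

Passing optimizer=None corresponds to the tune-once strategy (the kernel parameters found on the initial design are never revisited), while n_restarts_optimizer controls how much effort the retuning strategy spends re-maximizing the likelihood at each update; the paper's intermediate strategies sit between these two extremes.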
Pages: 1240-1252 (13 pages)
Related papers (50 records; items [21]-[30] shown)
  • [21] Chen, Xiaohan; Liu, Jialin; Wang, Zhangyang; Yin, Wotao. Hyperparameter Tuning is All You Need for LISTA. Advances in Neural Information Processing Systems 34 (NeurIPS 2021), 2021, 34.
  • [22] Anguita, D.; Ridella, S.; Rivieccio, F.; Zunino, R. Automatic hyperparameter tuning for support vector machines. Artificial Neural Networks - ICANN 2002, 2002, 2415: 1345-1350.
  • [23] Kawasumi, Ryota; Takeda, Koujin. Automatic Hyperparameter Tuning in Sparse Matrix Factorization. Neural Computation, 2023, 35(6): 1086-1099.
  • [24] Veloso, Bruno; Gama, Joao; Malheiro, Benedita; Vinagre, Joao. Hyperparameter self-tuning for data streams. Information Fusion, 2021, 76: 75-86.
  • [25] Meyer, Raphael A.; Musco, Christopher. The Statistical Cost of Robust Kernel Hyperparameter Tuning. Advances in Neural Information Processing Systems 33 (NeurIPS 2020), 2020, 33.
  • [26] Misra, Ujval; Liaw, Richard; Dunlap, Lisa; Bhardwaj, Romil; Kandasamy, Kirthevasan; Gonzalez, Joseph E.; Stoica, Ion; Tumanov, Alexey. RubberBand: Cloud-based Hyperparameter Tuning. Proceedings of the Sixteenth European Conference on Computer Systems (EuroSys '21), 2021: 327-342.
  • [27] Khandelwal, Siddhant; Pandey, Kavita; Rana, Sarthak; Kaushik, Prashant. Analysis of Hyperparameter Tuning in Neural Style Transfer. 2018 Fifth International Conference on Parallel, Distributed and Grid Computing (IEEE PDGC), 2018: 36-41.
  • [28] Wistuba, Martin; Schilling, Nicolas; Schmidt-Thieme, Lars. Sequential Model-free Hyperparameter Tuning. 2015 IEEE International Conference on Data Mining (ICDM), 2015: 1033-1038.
  • [29] Tripathy, Gyananjaya; Sharaff, Aakanksha. Hyperparameter elegance: fine-tuning text analysis with enhanced genetic algorithm hyperparameter landscape. Knowledge and Information Systems, 2024, 66(11): 6761-6783.
  • [30] Pravin, P. S.; Tan, Jaswin Zhi Ming; Wu, Zhe. Performance evaluation of various hyperparameter tuning strategies for forecasting uncertain parameters used in solving stochastic optimization problems. 2022 IEEE International Symposium on Advanced Control of Industrial Processes (ADCONIP 2022), 2022: 301-306.