Gradient Boosted Trees and Denoising Autoencoder to Correct Numerical Wave Forecasts

Cited by: 1
Authors
Yanchin, Ivan [1 ]
Soares, C. Guedes [1 ]
Affiliations
[1] Centre for Marine Technology and Ocean Engineering (CENTEC), Instituto Superior Técnico, Universidade de Lisboa, Lisbon, Portugal
Keywords
significant wave height; wind speed; denoising autoencoders; autoencoders; gradient boosting; machine learning; MODEL; COASTAL
DOI
10.3390/jmse12091573
CLC Classification Numbers
U6 [Waterway Transport]; P75 [Ocean Engineering]
Subject Classification Codes
0814; 081505; 0824; 082401
Abstract
This paper is dedicated to correcting the WAM/ICON numerical wave model predictions by reducing the residue between the model's predictions and the actual buoy observations. The two parameters considered are significant wave height and wind speed. The paper proposes two machine learning models for this task; both are multi-output models that correct the significant wave height and wind speed simultaneously. The first model is based on gradient boosted trees and is trained to predict the residue between the numerical model's forecasts and the actual buoy observations, using the other parameters predicted by the numerical model as inputs. The paper demonstrates that this model can significantly reduce the errors at all geographical locations considered. It also uses SHapley Additive exPlanations (SHAP) values to investigate the influence that the numerically predicted wave parameters have when the machine learning model predicts the residue. The second model is designed under the assumption that the residue can be modelled as noise added to the actual values; the paper therefore proposes using a denoising autoencoder to remove this noise from the numerical model's predictions. The results demonstrate that denoising autoencoders can remove the noise for the wind speed parameter, but their performance is poor for the significant wave height. The paper offers some explanations as to why this may happen.
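As an illustration of the first approach, the sketch below trains a multi-output gradient boosted tree model to predict the residue between a numerical forecast and the corresponding buoy observation, and then adds the predicted residue back to the forecast. It is a minimal sketch assuming scikit-learn; the library choice, feature set, hyperparameters and the synthetic arrays are illustrative assumptions, as the record above does not specify the paper's exact implementation.

```python
# Hedged sketch of residue correction with gradient boosted trees.
# Assumptions (not taken from the paper): scikit-learn as the library,
# synthetic arrays in place of collocated WAM/ICON forecasts and buoy
# observations, and arbitrary hyperparameters.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.multioutput import MultiOutputRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# X: parameters predicted by the numerical model at the buoy location
# (e.g. significant wave height, wind speed, period, direction, ...).
X = rng.normal(size=(1000, 6))
y_model = X[:, :2]                                    # forecast Hs and wind speed
y_obs = y_model + 0.3 * rng.normal(size=(1000, 2))    # synthetic "buoy observations"

# The learning target is the residue (observation minus forecast),
# predicted jointly for both parameters (multi-output).
residue = y_obs - y_model

X_tr, X_te, r_tr, r_te, ym_tr, ym_te, yo_tr, yo_te = train_test_split(
    X, residue, y_model, y_obs, test_size=0.2, random_state=0)

gbt = MultiOutputRegressor(
    GradientBoostingRegressor(n_estimators=300, learning_rate=0.05, max_depth=3))
gbt.fit(X_tr, r_tr)

# Corrected forecast = numerical forecast + predicted residue.
y_corr = ym_te + gbt.predict(X_te)
rmse_raw = np.sqrt(np.mean((ym_te - yo_te) ** 2, axis=0))
rmse_corr = np.sqrt(np.mean((y_corr - yo_te) ** 2, axis=0))
print("RMSE before correction:", rmse_raw)
print("RMSE after correction: ", rmse_corr)
```

The per-feature influence on the predicted residue can then be inspected for each target separately, for instance by applying SHAP's TreeExplainer to the fitted estimators in gbt.estimators_.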
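For the second approach, the denoising autoencoder treats the numerical forecast as a noisy version of the observed value and learns to reconstruct the clean value. The sketch below is an assumption-laden illustration using Keras; the architecture, input features and toy data are placeholders, not the configuration reported in the paper.

```python
# Hedged sketch of a denoising autoencoder for forecast correction.
# Assumptions (not taken from the paper): Keras as the framework, a tiny
# fully connected architecture, and synthetic data standing in for
# forecast/observation pairs of Hs and wind speed.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(1)
y_true = rng.uniform(0.5, 5.0, size=(2000, 2))           # "clean" observed values (toy)
y_noisy = y_true + 0.3 * rng.normal(size=y_true.shape)   # forecasts treated as noisy inputs

dae = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(2,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(4, activation="relu"),   # bottleneck
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(2),                      # reconstructed "clean" values
])
dae.compile(optimizer="adam", loss="mse")

# Train the network to map the noisy forecast back to the observed value.
dae.fit(y_noisy, y_true, epochs=20, batch_size=64, verbose=0)

print(dae.predict(y_noisy[:5], verbose=0))
```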
Pages: 18
Related Papers
50 records in total
  • [1] GB-CENT: Gradient Boosted Categorical Embedding and Numerical Trees
    Zhao, Qian
    Shi, Yue
    Hong, Liangjie
    PROCEEDINGS OF THE 26TH INTERNATIONAL CONFERENCE ON WORLD WIDE WEB (WWW'17), 2017, : 1311 - 1319
  • [2] IDAE: Imputation-boosted Denoising Autoencoder for Collaborative Filtering
    Lee, Jae-woong
    Lee, Jongwuk
    CIKM'17: PROCEEDINGS OF THE 2017 ACM CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, 2017, : 2143 - 2146
  • [3] Gradient Boosted Trees for Corrective Learning
    Oguz, Baris U.
    Shinohara, Russell T.
    Yushkevich, Paul A.
    Oguz, Ipek
    MACHINE LEARNING IN MEDICAL IMAGING (MLMI 2017), 2017, 10541 : 203 - 211
  • [4] Counting People using Gradient Boosted Trees
    Zhou, Bingyin
    Lu, Ming
    Wang, Yonggang
    2016 IEEE INFORMATION TECHNOLOGY, NETWORKING, ELECTRONIC AND AUTOMATION CONTROL CONFERENCE (ITNEC), 2016, : 391 - 395
  • [5] Robust Supply Chains with Gradient Boosted Trees
    Mahato, Pradeep K.
    Narayan, Apurva
    2020 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (SSCI), 2020, : 2633 - 2639
  • [6] Gradient Boosted Decision Trees for Lithology Classification
    Dev, Vikrant A.
    Eden, Mario R.
    PROCEEDINGS OF THE 9TH INTERNATIONAL CONFERENCE ON FOUNDATIONS OF COMPUTER-AIDED PROCESS DESIGN, 2019, 47 : 113 - 118
  • [7] Block-distributed Gradient Boosted Trees
    Vasiloudis, Theodore
    Cho, Hyunsu
    Bostrom, Henrik
    PROCEEDINGS OF THE 42ND INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '19), 2019, : 1025 - 1028
  • [8] Gradient boosted trees for evolving data streams
    Gunasekara, Nuwan
    Pfahringer, Bernhard
    Gomes, Heitor
    Bifet, Albert
    MACHINE LEARNING, 2024, 113 (05) : 3325 - 3352
  • [9] Leaves on trees: identifying halo stars with extreme gradient boosted trees
    Veljanoski, Jovan
    Helmi, Amina
    Breddels, Maarten
    Posti, Lorenzo
    ASTRONOMY & ASTROPHYSICS, 2018, 621