Trade-off between the global and extreme bias: An improved optimization approach in forecast combination

Cited by: 0
Authors
Zhang Y. [1,2]
Cheng S. [1,2]
Han Y. [3]
Wang J. [1,2]
Wang S. [1,2]
Affiliations
[1] Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing
[2] University of Chinese Academy of Sciences, Beijing
[3] China Energy Technology and Economics Research Institute, Beijing
Funding
National Natural Science Foundation of China
Keywords
bias trade-off; extreme bias; forecast combination; particle swarm optimization;
DOI
10.12011/SETP2022-0533
Abstract
Forecast combination is an important branch of ensemble learning and an effective tool for improving forecasting accuracy. Global bias estimation of a forecaster is already widely applied; for instance, the ensemble model's in-sample mean squared error is the main objective of traditional weight-optimization approaches in forecast combination. However, because of the risk of overfitting, minimizing the training error does not necessarily minimize the generalization error. Therefore, to diversify the optimization objectives, reduce the risk of weight overfitting, and control the tail error of the ensemble model, this paper defines a metric that captures the extreme bias of the combined forecaster. A novel objective function is then proposed that trades off global bias against extreme bias to obtain the optimal weights; specifically, particle swarm optimization is introduced to compute the optimal combination weights. Experimental results on gold-price and oil-price data demonstrate that the proposed forecast combination approach efficiently reduces overfitting risk and improves generalization ability, outperforming simple averaging, the optimal-weight method, and other benchmark models. © 2023 Systems Engineering Society of China. All rights reserved.
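The abstract describes weighting base forecasts by minimizing an objective that blends global bias (in-sample MSE) with an extreme-bias (tail-error) term, with the weights found by particle swarm optimization. The paper's exact metric and objective are not reproduced in this record, so the sketch below assumes a simple convex blend λ·MSE + (1−λ)·max|error| and a textbook global-best PSO over simplex-projected weights; the data, the parameter λ, and all PSO settings are hypothetical illustrations, not the authors' specification.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: T observations and forecasts from M base models (hypothetical).
T, M = 200, 3
actual = np.sin(np.linspace(0, 8, T)) + 0.1 * rng.standard_normal(T)
forecasts = actual[:, None] + 0.3 * rng.standard_normal((T, M))

def objective(w, lam=0.7):
    """Assumed trade-off: lam * global bias (MSE) + (1-lam) * extreme bias."""
    err = actual - forecasts @ w
    return lam * np.mean(err**2) + (1 - lam) * np.max(np.abs(err))

def project_simplex(w):
    """Keep weights non-negative and summing to one."""
    w = np.clip(w, 0.0, None)
    s = w.sum()
    return w / s if s > 0 else np.full_like(w, 1.0 / len(w))

def pso(n_particles=30, iters=200, inertia=0.7, c1=1.5, c2=1.5):
    # Initialize particles on the simplex; seed one with equal weights.
    pos = np.array([project_simplex(rng.random(M)) for _ in range(n_particles)])
    pos[0] = np.full(M, 1.0 / M)
    vel = 0.1 * rng.standard_normal((n_particles, M))
    pbest = pos.copy()
    pbest_val = np.array([objective(p) for p in pos])
    g = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((n_particles, M)), rng.random((n_particles, M))
        # Standard velocity update: inertia + cognitive + social terms.
        vel = inertia * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = np.array([project_simplex(p) for p in pos + vel])
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        g = pbest[np.argmin(pbest_val)].copy()
    return g

w = pso()
print("weights:", np.round(w, 3), "objective:", round(objective(w), 4))
```

Because the equal-weight vector is seeded as one particle and personal bests only improve, the returned weights are guaranteed to score no worse than simple averaging on the training objective; whether that carries over out of sample is exactly the generalization question the paper studies.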
Pages: 1837-1851
Page count: 14