Improving hyper-parameter self-tuning for data streams by adapting an evolutionary approach

Cited by: 1
Authors
Moya, Antonio R. [1 ]
Veloso, Bruno [2 ,3 ]
Gama, Joao [2 ,3 ]
Ventura, Sebastian [1 ]
Affiliations
[1] Univ Cordoba, Andalusian Res Inst Data Sci & Computat Intelligen, Dept Comp Sci & Numer Anal, Cordoba, Spain
[2] INESC TEC, Porto, Portugal
[3] Univ Porto, FEP, Porto, Portugal
Keywords
Data streams; Concept drift; Optimisation; Hyper-parameters; Evolutionary algorithms
DOI
10.1007/s10618-023-00997-7
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Hyper-parameter tuning of machine learning models has become a crucial task for achieving optimal performance. Researchers have explored this optimisation task extensively over the last decades, producing a range of state-of-the-art methods. However, most of this work focuses on batch or offline learning, where data distributions do not change arbitrarily over time. Dealing with data streams and online learning, by contrast, is a challenging problem: as technology advances, sophisticated techniques for processing these data streams become increasingly important, making hyper-parameter self-tuning during online learning crucial. To this end, in this paper, we present MESSPT, an evolutionary algorithm for hyper-parameter self-tuning on data streams. We apply Differential Evolution to dynamically sized samples, requiring only a single pass over the data to train and evaluate models and choose the best configurations. Because the number of configurations to be evaluated must be kept small, this evolutionary approach is a micro-evolutionary one. Furthermore, we control how our evolutionary algorithm deals with concept drift. Experiments on different learning tasks and over well-known datasets show that our proposed MESSPT outperforms the state-of-the-art on hyper-parameter tuning for data streams.
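The core mechanism the abstract describes, Differential Evolution over a micro-population of hyper-parameter configurations, each scored on a sample of the stream, can be sketched as follows. This is an illustrative DE/rand/1/bin generation step, not the MESSPT implementation: the fitness function, the hyper-parameter bounds, and all names here are hypothetical stand-ins (a real system would score each configuration by prequential accuracy on the current stream sample).

```python
import random

random.seed(0)  # reproducible demo

def micro_de_step(pop, fitness_fn, bounds, f=0.8, cr=0.9):
    """One generation of DE/rand/1/bin over a micro-population of
    hyper-parameter vectors (a sketch, not the MESSPT algorithm)."""
    new_pop = []
    for i, target in enumerate(pop):
        # pick three distinct individuals other than the target
        a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        # mutation: donor = a + F * (b - c), clipped to the search bounds
        donor = [min(max(a[d] + f * (b[d] - c[d]), bounds[d][0]), bounds[d][1])
                 for d in range(len(target))]
        # binomial crossover: mix donor and target coordinates
        j_rand = random.randrange(len(target))
        trial = [donor[d] if (random.random() < cr or d == j_rand) else target[d]
                 for d in range(len(target))]
        # greedy selection: keep whichever scores better on the current sample
        new_pop.append(trial if fitness_fn(trial) >= fitness_fn(target) else target)
    return new_pop

# Toy fitness standing in for prequential accuracy on a stream sample:
# configurations near (learning_rate=0.1, window=50) score best.
def fitness(cfg):
    return -((cfg[0] - 0.1) ** 2 + ((cfg[1] - 50.0) / 100.0) ** 2)

bounds = [(0.01, 1.0), (1.0, 200.0)]  # hypothetical hyper-parameter ranges
pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(5)]
for _ in range(20):                   # a few cheap micro-generations
    pop = micro_de_step(pop, fitness, bounds)
best = max(pop, key=fitness)
```

The micro-population (five individuals here) keeps the number of model evaluations per generation small, which is the point of the "micro-evolutionary" framing: each fitness call costs a pass over a stream sample, so the budget must stay tight. On a drift signal, a real system would re-seed or perturb the population rather than continue converging on stale configurations.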
Pages: 1289-1315 (27 pages)