A comparative analysis of gradient boosting algorithms

Cited by: 0
Authors
Candice Bentéjac
Anna Csörgő
Gonzalo Martínez-Muñoz
Institutions
[1] University of Bordeaux, College of Science and Technology
[2] Pázmány Péter Catholic University, Faculty of Information Technology and Bionics
[3] Universidad Autónoma de Madrid, Escuela Politécnica Superior
Keywords
XGBoost; LightGBM; CatBoost; Gradient boosting; Random forest; Ensembles of classifiers
DOI
Not available
Abstract
The family of gradient boosting algorithms has recently been extended with several interesting proposals (namely XGBoost, LightGBM and CatBoost) that focus on both speed and accuracy. XGBoost is a scalable ensemble technique that has proven to be a reliable and efficient solver of machine learning challenges. LightGBM is an accurate model focused on providing extremely fast training by selectively sampling high-gradient instances. CatBoost modifies the computation of gradients to avoid prediction shift and thereby improve model accuracy. This work presents a practical analysis of how these novel variants of gradient boosting behave in terms of training speed, generalization performance and hyper-parameter setup. In addition, a comprehensive comparison between XGBoost, LightGBM, CatBoost, random forests and gradient boosting has been performed using both carefully tuned models and their default settings. The results of this comparison indicate that CatBoost obtains the best generalization accuracy and AUC on the studied datasets, although the differences are small. LightGBM is the fastest of all methods, but not the most accurate. XGBoost places second in both accuracy and training speed. Finally, an extensive analysis of the effect of hyper-parameter tuning in XGBoost, LightGBM and CatBoost is carried out using two novel proposed tools.
Pages: 1937 - 1967
Page count: 30
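The default-settings comparison described in the abstract (accuracy, AUC and training time for the three libraries) is straightforward to reproduce in outline. The sketch below is a minimal, illustrative version only: it assumes the scikit-learn breast cancer dataset as a stand-in for the paper's benchmark datasets, a single fixed train/test split, and the libraries' out-of-the-box constructors. It is not the authors' experimental pipeline, and the two proposed tuning-analysis tools mentioned in the abstract are not reproduced here.

```python
import time

from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split

from catboost import CatBoostClassifier
from lightgbm import LGBMClassifier
from xgboost import XGBClassifier

# Small stand-in dataset; the paper evaluates a larger benchmark suite.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# All three libraries at their default settings, mirroring the paper's
# untuned comparison (random_state fixed only for reproducibility).
models = {
    "XGBoost": XGBClassifier(random_state=0),
    "LightGBM": LGBMClassifier(random_state=0),
    "CatBoost": CatBoostClassifier(random_state=0, verbose=0),
}

for name, model in models.items():
    start = time.perf_counter()
    model.fit(X_train, y_train)              # training-speed measurement
    fit_seconds = time.perf_counter() - start
    acc = accuracy_score(y_test, model.predict(X_test))
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    print(f"{name:9s} acc={acc:.4f}  auc={auc:.4f}  fit={fit_seconds:.2f}s")
```

A tuned variant of the same comparison could be layered on top with standard search utilities such as scikit-learn's GridSearchCV or RandomizedSearchCV, though the paper's own tuning protocol may differ from that.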
Related papers
50 items in total
  • [41] Atrial lead system for enhanced P-wave recording: A comparative study on optimal leads using gradient boosting and deep learning algorithms
    Venkatesh, N. Prasanna
    Kumar, R. Pradeep
    Neelapu, Bala Chakravarthy
    Pal, Kunal
    Sivaraman, J.
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2024, 97
  • [42] Stochastic gradient boosting
    Friedman, J. H.
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2002, 38 (04) : 367 - 378
  • [43] Accelerated gradient boosting
    Biau, G.
    Cadre, B.
    Rouvière, L.
    MACHINE LEARNING, 2019, 108 (06) : 971 - 992
  • [45] A comparative spatial analysis of flood susceptibility mapping using boosting machine learning algorithms in Rathnapura, Sri Lanka
    Kurugama, Kumudu Madhawa
    Kazama, So
    Hiraga, Yusuke
    Samarasuriya, Chaminda
    JOURNAL OF FLOOD RISK MANAGEMENT, 2024, 17 (02)
  • [46] Online Gradient Boosting
    Beygelzimer, Alina
    Hazan, Elad
    Kale, Satyen
    Luo, Haipeng
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 28 (NIPS 2015), 2015, 28
  • [47] Infinitesimal gradient boosting
    Dombry, Clement
    Duchamps, Jean-Jil
    STOCHASTIC PROCESSES AND THEIR APPLICATIONS, 2024, 170
  • [48] Regularized Gradient Boosting
    Cortes, Corinna
    Mohri, Mehryar
    Storcheus, Dmitry
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [49] Comparative Study of Electricity-Theft Detection Based on Gradient Boosting Machine
    Yan, Zhongzong
    Wen, He
    2021 IEEE INTERNATIONAL INSTRUMENTATION AND MEASUREMENT TECHNOLOGY CONFERENCE (I2MTC 2021), 2021
  • [50] Comparative analysis of classification algorithms
    Muhamedyev, R.
    Yakunin, K.
    Iskakov, S.
    Sainova, S.
    Abdilmanova, A.
    Kuchin, Y.
    2015 9TH INTERNATIONAL CONFERENCE ON APPLICATION OF INFORMATION AND COMMUNICATION TECHNOLOGIES (AICT), 2015, : 96 - 101