Extreme Gradient Boosting with Squared Logistic Loss Function

Cited by: 10
|
Authors
Sharma, Nonita [1 ]
Anju [1 ]
Juneja, Akanksha [2 ]
Affiliations
[1] Dr BR Ambedkar Natl Inst Technol Jalandhar, Jalandhar, Punjab, India
[2] Jawaharlal Nehru Univ, Delhi, India
Keywords
Boosting; Extreme gradient boosting; Squared logistic loss
DOI
10.1007/978-981-13-0923-6_27
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Tree boosting has empirically proven to be a highly effective and versatile approach to predictive modeling. The core argument is that tree boosting can adaptively determine the local neighborhoods of the model, thereby taking the bias-variance trade-off into consideration during model fitting. Recently, a tree boosting method known as XGBoost has gained popularity by providing higher accuracy. XGBoost introduces further improvements that allow it to handle the bias-variance trade-off even more carefully. In this manuscript, the predictive accuracy of XGBoost is further enhanced by applying a loss function named squared logistic loss (SqLL). The accuracy of the proposed algorithm, i.e., XGBoost with SqLL, is evaluated using the train/test split, K-fold cross-validation, and stratified cross-validation.
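XGBoost's custom-objective interface requires the first and second derivatives of the loss with respect to the raw score. The record above does not give the exact SqLL formula; a plausible reading is ℓ(y, f) = ½·log²(1 + e^(−yf)) for labels y ∈ {−1, +1}, i.e., the squared logistic (softplus) loss. The sketch below (an assumption, not the authors' code) derives the gradient and Hessian that such a loss would feed into a second-order booster:

```python
import numpy as np

def sqll(y, f):
    # Squared logistic loss (assumed form): 0.5 * log(1 + exp(-y*f))**2
    return 0.5 * np.log1p(np.exp(-y * f)) ** 2

def sqll_grad_hess(y, f):
    # First- and second-order derivatives of sqll w.r.t. the raw score f,
    # the pair a second-order booster such as XGBoost consumes.
    s = np.log1p(np.exp(-y * f))        # softplus(-y*f)
    sig = 1.0 / (1.0 + np.exp(y * f))   # sigmoid(-y*f)
    grad = -y * sig * s
    # Uses y**2 == 1 for labels in {-1, +1}; hess > 0, so splits stay well-posed.
    hess = sig ** 2 + s * sig * (1.0 - sig)
    return grad, hess
```

If the `xgboost` package is available, this pair could be passed as a custom objective, e.g. `obj=lambda preds, dtrain: sqll_grad_hess(2 * dtrain.get_label() - 1, preds)` in `xgb.train` (mapping {0, 1} labels to {−1, +1}); again, this mirrors the library's generic custom-objective hook, not the paper's own implementation.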
Pages: 313-322
Page count: 10
Related papers
50 records in total
  • [21] Prediction of Cable Failures based on eXtreme Gradient Boosting
    Zhan, Huiyu
    Liu, Keyan
    Jia, Dongli
    2024 6TH ASIA ENERGY AND ELECTRICAL ENGINEERING SYMPOSIUM, AEEES 2024, 2024: 610-614
  • [22] Railroad accident analysis using extreme gradient boosting
    Bridgelall, Raj
    Tolliver, Denver D.
    ACCIDENT ANALYSIS AND PREVENTION, 2021, 156
  • [23] Wind Speed Forecasting Based on Extreme Gradient Boosting
    Cai, Ren
    Xie, Sen
    Wang, Bozhong
    Yang, Ruijiang
    Xu, Daosen
    He, Yang
    IEEE ACCESS, 2020, 8(08): 175063-175069
  • [24] Bioactive Molecule Prediction Using Extreme Gradient Boosting
    Mustapha, Ismail Babajide
    Saeed, Faisal
    MOLECULES, 2016, 21(08)
  • [25] Malware Detection Using Gradient Boosting Decision Trees with Customized Log Loss Function
    Gao, Yun
    Hasegawa, Hirokazu
    Yamaguchi, Yukiko
    Shimada, Hajime
    35TH INTERNATIONAL CONFERENCE ON INFORMATION NETWORKING (ICOIN 2021), 2021: 273-278
  • [26] On the Universality of the Logistic Loss Function
    Painsky, Amichai
    Wornell, Gregory
    2018 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2018: 936-940
  • [27] SIMULTANEOUS ESTIMATION WITH A SQUARED ERROR LOSS FUNCTION
    VANDEUSEN, PC
    CANADIAN JOURNAL OF FOREST RESEARCH-REVUE CANADIENNE DE RECHERCHE FORESTIERE, 1988, 18(08): 1093-1096
  • [28] Fully corrective gradient boosting with squared hinge: Fast learning rates and early stopping
    Zeng, Jinshan
    Zhang, Min
    Lin, Shao-Bo
    NEURAL NETWORKS, 2022, 147: 136-151
  • [29] Nuclear charge radius predictions based on eXtreme Gradient Boosting
    Li, Weifeng
    Zhang, Xiaoyan
    Fang, Jiyu
    PHYSICA SCRIPTA, 2024, 99(04)
  • [30] Forecasting inflation rates be extreme gradient boosting with the genetic algorithm
    Li Y.-S.
    Pai P.-F.
    Lin Y.-L.
    Journal of Ambient Intelligence and Humanized Computing, 2023, 14(03): 2211-2220