Gradient descent algorithms for quantile regression with smooth approximation

Cited by: 0
Author
Songfeng Zheng
Affiliation
[1] Missouri State University, Department of Mathematics
Keywords
Quantile regression; Gradient descent; Boosting; Variable selection
DOI
Not available
Abstract
Gradient-based optimization methods often converge quickly to a local optimum. However, the check loss function used by the quantile regression model is not differentiable everywhere, which prevents gradient-based optimization methods from being applied directly. This paper therefore introduces a smooth function to approximate the check loss, so that gradient-based optimization methods can be employed to fit the quantile regression model. The properties of the smooth approximation are discussed, and two algorithms are proposed for minimizing the smoothed objective function. The first applies gradient descent directly, resulting in the gradient descent smooth quantile regression model; the second minimizes the smoothed objective function within the framework of functional gradient descent, updating the fitted model along the negative gradient direction in each iteration, which yields the boosted smooth quantile regression algorithm. Extensive experiments on simulated and real-world data show that, compared with alternative quantile regression models, the proposed smooth quantile regression algorithms achieve higher prediction accuracy and are more effective at removing noninformative predictors.
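The idea described in the abstract — replacing the non-differentiable check loss with a smooth surrogate so that plain gradient descent applies — can be sketched as follows. This is an illustrative Python sketch, not the paper's implementation: it assumes a logistic-type surrogate S(r) = τ·r + α·log(1 + exp(−r/α)), which recovers the check loss ρ_τ(r) = r·(τ − 1{r<0}) as α → 0; the function names, step size, and data are all made up for the example.

```python
import numpy as np

def smooth_check_loss(r, tau, alpha=0.1):
    """Smooth surrogate of the check loss rho_tau(r) = r * (tau - 1{r < 0}).

    S(r) = tau*r + alpha*log(1 + exp(-r/alpha)) is everywhere differentiable
    and converges pointwise to the check loss as alpha -> 0.
    (Illustrative surrogate; see the paper for the exact approximation used.)
    """
    # np.logaddexp(0, x) computes log(1 + exp(x)) in a numerically stable way
    return tau * r + alpha * np.logaddexp(0.0, -r / alpha)

def smooth_check_grad(r, tau, alpha=0.1):
    """Derivative of the surrogate: tau - sigmoid(-r/alpha).

    Written via tanh, which is numerically stable for large |r|/alpha.
    """
    return tau - 0.5 * (1.0 - np.tanh(r / (2.0 * alpha)))

def fit_smooth_qr(X, y, tau, alpha=0.1, lr=0.01, n_iter=5000):
    """Plain gradient descent on the smoothed quantile-regression objective."""
    X1 = np.column_stack([np.ones(len(y)), X])   # prepend an intercept column
    beta = np.zeros(X1.shape[1])
    for _ in range(n_iter):
        r = y - X1 @ beta                        # residuals
        # chain rule: d/dbeta mean S(y - X beta) = -X^T S'(r) / n
        g = -X1.T @ smooth_check_grad(r, tau, alpha) / len(y)
        beta -= lr * g
    return beta

# Toy data: y = 2 + 0.5 x + standard normal noise, so the conditional
# median (tau = 0.5) has intercept ~2.0 and slope ~0.5.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 500)
y = 2.0 + 0.5 * x + rng.normal(0.0, 1.0, 500)
beta = fit_smooth_qr(x.reshape(-1, 1), y, tau=0.5)
print(beta)  # beta[0] near 2.0, beta[1] near 0.5
```

For quantiles other than the median, only `tau` changes; the smoothing parameter `alpha` trades off approximation accuracy against the smoothness (and hence the usable step size) of the objective.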
Pages: 191 - 207
Number of pages: 16
Related Papers
(50 in total)
  • [1] Gradient descent algorithms for quantile regression with smooth approximation
    Zheng, Songfeng
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2011, 2 (03) : 191 - 207
  • [2] A coordinate descent algorithm for computing penalized smooth quantile regression
    Mkhadri, Abdallah
    Ouhourane, Mohamed
    Oualkacha, Karim
    STATISTICS AND COMPUTING, 2017, 27 (04) : 865 - 883
  • [3] Approximation Analysis of Learning Algorithms for Support Vector Regression and Quantile Regression
    Xiang, Dao-Hong
    Hu, Ting
    Zhou, Ding-Xuan
    JOURNAL OF APPLIED MATHEMATICS, 2012
  • [4] Block-wise Descent Algorithms for Group Variable-Selection in Quantile Regression
    Oualkacha, Karim
    Ouhourane, Mohamed
    Yang, Yi
    Greenwood, Celia M. T.
    GENETIC EPIDEMIOLOGY, 2017, 41 (07) : 700 - 700
  • [5] Boosting algorithms as gradient descent
    Mason, L.
    Baxter, J.
    Bartlett, P.
    Frean, M.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 12, 2000, 12 : 512 - 518
  • [6] Advanced algorithms for penalized quantile and composite quantile regression
    Pietrosanu, Matthew
    Gao, Jueyu
    Kong, Linglong
    Jiang, Bei
    Niu, Di
    COMPUTATIONAL STATISTICS, 2021, 36 (01) : 333 - 346
  • [7] Smooth Approximation of the Quantile Function Derivatives
    Sobol, V. R.
    Torishnyy, R. O.
    BULLETIN OF THE SOUTH URAL STATE UNIVERSITY SERIES-MATHEMATICAL MODELLING PROGRAMMING & COMPUTER SOFTWARE, 2022, 15 (04) : 115 - 122
  • [8] Smooth Density Spatial Quantile Regression
    Brantley, Halley
    Fuentes, Montserrat
    Guinness, Joseph
    Thoma, Eben
    STATISTICA SINICA, 2021, 31 (03) : 1167 - 1187