Gradient descent algorithms for quantile regression with smooth approximation

Cited by: 0
Authors
Songfeng Zheng
Affiliation
[1] Missouri State University, Department of Mathematics
Keywords
Quantile regression; Gradient descent; Boosting; Variable selection
DOI: not available
Abstract
Gradient-based optimization methods often converge quickly to a local optimum. However, the check loss function used by the quantile regression model is not everywhere differentiable, which prevents gradient-based optimization methods from being applied directly. This paper therefore introduces a smooth function to approximate the check loss so that gradient-based optimization methods can be employed to fit the quantile regression model. The properties of the smooth approximation are discussed, and two algorithms are proposed for minimizing the smoothed objective function. The first applies gradient descent directly, resulting in the gradient descent smooth quantile regression model; the second minimizes the smoothed objective function in the framework of functional gradient descent, changing the fitted model along the negative gradient direction in each iteration, which yields the boosted smooth quantile regression algorithm. Extensive experiments on simulated and real-world data show that, compared to alternative quantile regression models, the proposed smooth quantile regression algorithms achieve higher prediction accuracy and are more efficient at removing noninformative predictors.
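The abstract names the two algorithms but not the exact smoothing function or base learner. Below is a minimal sketch of both ideas, assuming a logistic-type surrogate S_alpha(u) = tau*u + alpha*log(1 + exp(-u/alpha)) for the check loss (a common smooth approximation, but an assumption here) and shallow regression trees as the base learner in the boosted variant; the function names gd_smooth_qr and boosted_smooth_qr are illustrative, not the paper's code.

```python
import numpy as np
from scipy.special import expit          # numerically stable sigmoid
from sklearn.tree import DecisionTreeRegressor

def smooth_check_loss(u, tau, alpha=0.1):
    """Smooth surrogate for the check loss rho_tau(u) = u * (tau - I(u < 0)).
    Assumed form: S_alpha(u) = tau*u + alpha*log(1 + exp(-u/alpha)),
    differentiable everywhere and tending to rho_tau as alpha -> 0."""
    return tau * u + alpha * np.logaddexp(0.0, -u / alpha)

def smooth_check_grad(u, tau, alpha=0.1):
    """Derivative of S_alpha with respect to u: tau - sigmoid(-u/alpha)."""
    return tau - expit(-u / alpha)

def gd_smooth_qr(X, y, tau, alpha=0.1, lr=0.05, n_iter=2000):
    """Linear quantile regression fitted by plain gradient descent on the
    smoothed objective (1/n) * sum_i S_alpha(y_i - x_i' beta)."""
    n, p = X.shape
    Xb = np.hstack([np.ones((n, 1)), X])     # prepend intercept column
    beta = np.zeros(p + 1)
    for _ in range(n_iter):
        u = y - Xb @ beta                    # residuals
        # chain rule: gradient w.r.t. beta is -(1/n) * X' S_alpha'(u)
        beta -= lr * (-(Xb.T @ smooth_check_grad(u, tau, alpha)) / n)
    return beta

def boosted_smooth_qr(X, y, tau, alpha=0.1, lr=0.1, n_rounds=200):
    """Functional gradient descent: each round fits a shallow tree to the
    negative functional gradient of the smoothed loss and moves the
    fitted model F a small step along it."""
    F = np.full(len(y), np.median(y))        # constant initial model
    trees = []
    for _ in range(n_rounds):
        # -d/dF S_alpha(y - F) = S_alpha'(y - F)
        neg_grad = smooth_check_grad(y - F, tau, alpha)
        tree = DecisionTreeRegressor(max_depth=2).fit(X, neg_grad)
        trees.append(tree)
        F += lr * tree.predict(X)
    return trees
```

As alpha shrinks toward zero the surrogate approaches the check loss, so the bias introduced by smoothing can be controlled; predictions from the boosted sketch are the initial median plus lr times the sum of tree.predict over the fitted trees.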
Pages: 191-207 (16 pages)
Related papers
50 results
  • [21] Online gradient descent learning algorithms
    Ying, Yiming
    Pontil, Massimiliano
    FOUNDATIONS OF COMPUTATIONAL MATHEMATICS, 2008, 8 (05) : 561 - 596
  • [22] Optimal Smooth Approximation for Quantile Matrix Factorization
    Liu, Peng
    Liu, Yi
    Zhu, Rui
    Kong, Linglong
    Jiang, Bei
    Niu, Di
    PROCEEDINGS OF THE 2023 SIAM INTERNATIONAL CONFERENCE ON DATA MINING, SDM, 2023, : 595 - 603
  • [23] Quantile Stein Variational Gradient Descent for Batch Bayesian Optimization
    Gong, Chengyue
    Peng, Jian
    Liu, Qiang
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [24] Projective Approximation Based Gradient Descent Modification
    Senov, Alexander
    Granichin, Oleg
    IFAC PAPERSONLINE, 2017, 50 (01): : 3899 - 3904
  • [25] Efficient importance sampling imputation algorithms for quantile and composite quantile regression
    Cheng, Hao
    STATISTICAL ANALYSIS AND DATA MINING, 2022, 15 (03) : 339 - 356
  • [26] On the diffusion approximation of nonconvex stochastic gradient descent
    Hu, Wenqing
    Li, Chris Junchi
    Li, Lei
    Liu, Jian-Guo
    ANNALS OF MATHEMATICAL SCIENCES AND APPLICATIONS, 2019, 4 (01) : 3 - 32
  • [27] Smooth momentum: improving lipschitzness in gradient descent
    Kim, Bum Jun
    Choi, Hyeyeon
    Jang, Hyeonah
    Kim, Sang Woo
    APPLIED INTELLIGENCE, 2023, 53 (11) : 14233 - 14248
  • [29] Stochastic Gradient Descent Meets Distribution Regression
    Muecke, Nicole
    24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS), 2021, 130
  • [30] Convergence analysis of gradient descent stochastic algorithms
    Shapiro, A
    Wardi, Y
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 1996, 91 (02) : 439 - 454