Gradient descent algorithms for quantile regression with smooth approximation

Cited by: 0
Author
Songfeng Zheng
Affiliation
[1] Missouri State University, Department of Mathematics
Keywords
Quantile regression; Gradient descent; Boosting; Variable selection
Abstract
Gradient-based optimization methods often converge quickly to a local optimum. However, the check loss function used by the quantile regression model is not everywhere differentiable, which prevents gradient-based optimization methods from being applied directly. This paper therefore introduces a smooth function to approximate the check loss so that gradient-based optimization methods can be employed to fit the quantile regression model. The properties of the smooth approximation are discussed. Two algorithms are proposed for minimizing the smoothed objective function. The first applies gradient descent directly, yielding the gradient descent smooth quantile regression model; the second minimizes the smoothed objective function in the framework of functional gradient descent, updating the fitted model along the negative gradient direction in each iteration, which yields the boosted smooth quantile regression algorithm. Extensive experiments on simulated and real-world data show that, compared to alternative quantile regression models, the proposed smooth quantile regression algorithms achieve higher prediction accuracy and are more efficient at removing noninformative predictors.
Pages: 191–207 (16 pages)
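
The abstract names the ingredients but not the exact construction, so the following is a minimal runnable sketch rather than the paper's implementation. It assumes the logistic-type smoother S_alpha(u) = tau*u + alpha*log(1 + exp(-u/alpha)), a common choice that recovers the check loss rho_tau(u) = u*(tau - 1{u<0}) as alpha -> 0, together with a linear model for the gradient descent variant and componentwise linear base learners for the boosting variant:

```python
# Sketch of the two algorithms described in the abstract. The smoother, the
# linear model, and the componentwise base learners are illustrative
# assumptions, not taken from the paper itself.
import numpy as np
from scipy.special import expit  # numerically stable sigmoid


def smooth_grad(u, tau, alpha):
    """dS_alpha/du for the residual u = y - f(x); tends to the check-loss
    subgradient tau - 1{u < 0} as alpha -> 0."""
    return tau - expit(-u / alpha)


def gd_smooth_qreg(X, y, tau=0.5, alpha=0.1, lr=0.1, iters=2000):
    """Gradient descent on the smoothed objective for a linear model."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(iters):
        u = y - X @ beta                    # current residuals
        beta += lr * X.T @ smooth_grad(u, tau, alpha) / n
    return beta


def boost_smooth_qreg(X, y, tau=0.5, alpha=0.1, nu=0.1, iters=500):
    """Functional gradient descent: each round fits the single predictor
    that best matches the negative gradient, then takes a small step nu."""
    n, p = X.shape
    beta = np.zeros(p)
    F = np.full(n, np.quantile(y, tau))     # start at the sample quantile
    xx = np.sum(X ** 2, axis=0)
    for _ in range(iters):
        r = smooth_grad(y - F, tau, alpha)  # pseudo-residuals (neg. gradient)
        xr = X.T @ r
        j = np.argmax(xr ** 2 / xx)         # coordinate with smallest SSE
        c = xr[j] / xx[j]                   # its least-squares coefficient
        beta[j] += nu * c
        F += nu * c * X[:, j]
    return beta
```

On a toy design where only two of ten predictors are informative, the boosted variant leaves the noninformative coefficients near zero, illustrating the variable selection behavior the abstract refers to; the stagewise, one-coordinate-at-a-time updates are what make this selection implicit:

```python
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = 2 * X[:, 0] - X[:, 1] + rng.normal(size=500)
print(gd_smooth_qreg(X, y).round(2))
print(boost_smooth_qreg(X, y).round(2))
```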