Gradient descent algorithms for quantile regression with smooth approximation

Cited by: 0
Authors
Songfeng Zheng
Affiliation
[1] Department of Mathematics, Missouri State University
Keywords
Quantile regression; Gradient descent; Boosting; Variable selection
DOI
Not available
Abstract
Gradient-based optimization methods often converge quickly to a local optimum. However, the check loss function used by the quantile regression model is not everywhere differentiable, which prevents gradient-based optimization methods from being applied directly. This paper therefore introduces a smooth function to approximate the check loss so that gradient-based optimization methods can be employed to fit the quantile regression model. The properties of the smooth approximation are discussed, and two algorithms are proposed for minimizing the smoothed objective function. The first applies gradient descent directly, yielding the gradient descent smooth quantile regression model; the second minimizes the smoothed objective in the framework of functional gradient descent, updating the fitted model along the negative gradient direction at each iteration, which yields the boosted smooth quantile regression algorithm. Extensive experiments on simulated and real-world data show that, compared with alternative quantile regression models, the proposed smooth quantile regression algorithms achieve higher prediction accuracy and are more effective at removing noninformative predictors.
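A minimal sketch of the idea described above: one common smooth surrogate for the check loss rho_tau(r) = r * (tau - I(r < 0)) is the logistic-type smoothing S(r) = tau * r + alpha * log(1 + exp(-r / alpha)), which is everywhere differentiable and recovers the check loss as alpha -> 0. The exact surrogate used in the paper, as well as the step size, smoothing parameter, and function names below, are assumptions for illustration; the sketch covers only the plain gradient descent variant on a linear model, not the boosting variant.

import numpy as np

def smooth_check_loss(r, tau, alpha):
    # Logistic-type smoothing of the check loss rho_tau(r) = r * (tau - I(r < 0));
    # approaches the check loss as alpha -> 0 (functional form assumed for illustration).
    return tau * r + alpha * np.logaddexp(0.0, -r / alpha)

def smooth_check_grad(r, tau, alpha):
    # Derivative of the smoothed loss w.r.t. the residual r:
    # dS/dr = tau - sigmoid(-r / alpha); tanh form avoids overflow for large |r|.
    return tau - 0.5 * (1.0 + np.tanh(-r / (2.0 * alpha)))

def fit_gd_smooth_qr(X, y, tau=0.5, alpha=0.1, lr=0.05, n_iter=2000):
    # Plain gradient descent on the smoothed quantile objective for a
    # linear model with intercept (illustrative sketch, not the paper's code).
    n, p = X.shape
    Xb = np.hstack([np.ones((n, 1)), X])      # prepend intercept column
    beta = np.zeros(p + 1)
    for _ in range(n_iter):
        r = y - Xb @ beta                      # residuals
        grad = -(Xb.T @ smooth_check_grad(r, tau, alpha)) / n
        beta -= lr * grad
    return beta

# Toy usage: estimate the 0.75 conditional quantile of a linear model.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = 1.0 + X @ np.array([2.0, 0.0, -1.0]) + rng.normal(scale=0.5, size=500)
print(fit_gd_smooth_qr(X, y, tau=0.75))        # [intercept, coefficients...]

Shrinking alpha makes the surrogate track the check loss more closely, but its gradient changes more sharply near zero, so in practice alpha trades approximation accuracy against the conditioning of the optimization.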
Pages: 191-207 (16 pages)
Related papers (50 in total)
  • [31] Distributed pairwise algorithms with gradient descent methods
    Wang, Baobin
    Hu, Ting
    NEUROCOMPUTING, 2019, 333 : 364 - 373
  • [32] Normalized Gradient Descent for Variational Quantum Algorithms
    Suzuki, Yudai
    Yano, Hiroshi
    Raymond, Rudy
    Yamamoto, Naoki
    2021 IEEE INTERNATIONAL CONFERENCE ON QUANTUM COMPUTING AND ENGINEERING (QCE 2021) / QUANTUM WEEK 2021, 2021, : 1 - 9
  • [33] On the momentum term in gradient descent learning algorithms
    Qian, N
    NEURAL NETWORKS, 1999, 12 (01) : 145 - 151
  • [34] Sensitivity-Free Gradient Descent Algorithms
    Matei, Ion
    Zhenirovskyy, Maksym
    de Kleer, Johan
    Maxwell, John
    JOURNAL OF MACHINE LEARNING RESEARCH, 2023, 24
  • [35] COORDINATE DESCENT ALGORITHMS FOR LASSO PENALIZED REGRESSION
    Wu, Tong Tong
    Lange, Kenneth
    ANNALS OF APPLIED STATISTICS, 2008, 2 (01): 224 - 244
  • [36] UNIFORM-IN-TIME WEAK ERROR ANALYSIS FOR STOCHASTIC GRADIENT DESCENT ALGORITHMS VIA DIFFUSION APPROXIMATION
    Feng, Yuanyuan
    Gao, Tingran
    Li, Lei
    Liu, Jian-Guo
    Lu, Yulong
    COMMUNICATIONS IN MATHEMATICAL SCIENCES, 2020, 18 (01) : 163 - 188
  • [37] Backfitting and smooth backfitting in varying coefficient quantile regression
    Lee, Young K.
    Mammen, Enno
    Park, Byeong U.
    ECONOMETRICS JOURNAL, 2014, 17 (02): S20 - S38
  • [38] A SMOOTH BLOCK BOOTSTRAP FOR QUANTILE REGRESSION WITH TIME SERIES
    Gregory, Karl B.
    Lahiri, Soumendra N.
    Nordman, Daniel J.
    ANNALS OF STATISTICS, 2018, 46 (03): 1138 - 1166
  • [39] Gradient-free Stein variational gradient descent with kernel approximation
    Yan, Liang
    Zou, Xiling
    APPLIED MATHEMATICS LETTERS, 2021, 121 (121)
  • [40] Limitations of the Empirical Fisher Approximation for Natural Gradient Descent
    Kunstner, Frederik
    Balles, Lukas
    Hennig, Philipp
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32