Variable selection and parameter estimation via WLAD-SCAD with a diverging number of parameters

Cited by: 7
Authors
Wang, Yanxin [1 ]
Zhu, Li [2 ]
Affiliations
[1] Ningbo Univ Technol, Sch Sci, Ningbo 315211, Zhejiang, Peoples R China
[2] Xiamen Univ Technol, Sch Appl Math, Xiamen 361024, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
WLAD-SCAD; Robust regularization; Oracle property; Variable selection; NONCONCAVE PENALIZED LIKELIHOOD; ABSOLUTE DEVIATION METHOD; QUANTILE REGRESSION; MODEL SELECTION; ROBUST REGRESSION; LINEAR-REGRESSION; ORACLE PROPERTIES; LASSO; SHRINKAGE; APPROXIMATION;
DOI
10.1016/j.jkss.2016.12.003
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline Classification Codes
020208 ; 070103 ; 0714 ;
Abstract
In this paper, we focus on variable selection based on the weighted least absolute deviation (WLAD) regression with a diverging number of parameters. The WLAD estimator and the smoothly clipped absolute deviation (SCAD) penalty are combined to achieve robust parameter estimation and variable selection in regression simultaneously. Compared with the LAD-SCAD method, the WLAD-SCAD method resists both heavy-tailed errors and outliers in the explanatory variables. Furthermore, we establish consistency and asymptotic normality of the estimators under appropriate regularity conditions. Simulation studies and a real-data example demonstrate the superiority of the WLAD-SCAD method over other regularization methods in the presence of outliers in the explanatory variables and heavy-tailed error distributions. (C) 2017 The Korean Statistical Society. Published by Elsevier B.V. All rights reserved.
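In the standard notation of the penalized-regression literature, a sketch of the estimator the abstract describes (the weighting scheme and tuning choices below are generic assumptions, not the paper's exact specification): the WLAD-SCAD estimator minimizes a weighted least absolute deviation loss plus a SCAD penalty,

\hat{\beta} = \arg\min_{\beta \in \mathbb{R}^{p_n}} \sum_{i=1}^{n} w_i \left| y_i - x_i^{\top}\beta \right| + n \sum_{j=1}^{p_n} p_{\lambda}\!\left(|\beta_j|\right),

where the weights w_i downweight high-leverage observations in the explanatory variables, and the SCAD penalty of Fan and Li (2001) is defined through its derivative

p_{\lambda}'(t) = \lambda \left\{ I(t \le \lambda) + \frac{(a\lambda - t)_{+}}{(a-1)\lambda}\, I(t > \lambda) \right\}, \qquad t \ge 0,\ a > 2,

with a = 3.7 the commonly recommended choice.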
Pages: 390-403
Number of pages: 14