On algorithms for ordinary least squares regression spline fitting: A comparative study

Cited by: 38
Author
Lee, TCM [1]
Affiliation
[1] Colorado State Univ, Dept Stat, Ft Collins, CO 80523 USA
Keywords
bivariate smoothing; generalized cross-validation; genetic algorithms; regression spline; stepwise selection
DOI
10.1080/00949650213743
CLC Classification
TP39 [Computer Applications]
Subject Classification Codes
081203; 0835
Abstract
Regression spline smoothing is a popular approach for conducting nonparametric regression. An important issue associated with it is the choice of a "theoretically best" set of knots. Different statistical model selection methods, such as Akaike's information criterion and generalized cross-validation, have been applied to derive different "theoretically best" sets of knots. Typically these best knot sets are defined implicitly as the optimizers of some objective functions. Hence another equally important issue concerning regression spline smoothing is how to optimize such objective functions. In this article different numerical algorithms that are designed for carrying out such optimization problems are compared by means of a simulation study. Both the univariate and bivariate smoothing settings will be considered. Based on the simulation results, recommendations for choosing a suitable optimization algorithm under various settings will be provided.
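The abstract describes choosing regression spline knots by minimizing a model selection criterion such as generalized cross-validation (GCV). The following is a minimal sketch, not the paper's implementation: it fits ordinary least squares cubic regression splines using a truncated power basis and selects only the number of quantile-placed knots by exhaustive GCV search, whereas the algorithms compared in the paper (e.g., stepwise selection and genetic algorithms) optimize the knot locations themselves. The basis choice, knot placement rule, and all function names below are illustrative assumptions.

import numpy as np

def truncated_power_basis(x, knots, degree=3):
    # Design matrix: polynomial terms 1, x, ..., x^degree plus one
    # truncated power term (x - t)_+^degree per knot t.
    cols = [x**d for d in range(degree + 1)]
    cols += [np.maximum(x - t, 0.0)**degree for t in knots]
    return np.column_stack(cols)

def gcv_score(y, X):
    # GCV = (RSS / n) / (1 - p/n)^2 for an OLS fit with design matrix X.
    n, p = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta)**2)
    return (rss / n) / (1.0 - p / n)**2

def fit_spline_gcv(x, y, max_knots=20, degree=3):
    # Score knot sets of size 1..max_knots (knots at sample quantiles)
    # and return the (score, knots, coefficients) minimizing GCV.
    best = None
    for k in range(1, max_knots + 1):
        knots = np.quantile(x, np.linspace(0, 1, k + 2)[1:-1])
        X = truncated_power_basis(x, knots, degree)
        score = gcv_score(y, X)
        if best is None or score < best[0]:
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            best = (score, knots, beta)
    return best

# Usage on synthetic univariate data.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 200))
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=x.size)
score, knots, beta = fit_spline_gcv(x, y)
print(f"GCV-selected number of knots: {len(knots)}, GCV score: {score:.4f}")

Because this sketch only varies the number of equally spaced (quantile-based) knots, the search is cheap; optimizing the positions of the knots, as the compared algorithms do, turns the problem into a combinatorial optimization over candidate knot subsets.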
Pages: 647-663
Number of pages: 17