BIAS AND VARIANCE REDUCTION PROCEDURES IN NON-PARAMETRIC REGRESSION

Cited: 0
Authors
Cockeran, Marike [1 ]
Swanepoel, Cornelia J. [1 ]
Affiliations
[1] North West Univ, Potchefstroom, South Africa
Keywords
Bagging; Bandwidth; Boosting; Bragging; Cross-validation; Kernel estimators; Non-parametric; Regression
DOI
Not available
Chinese Library Classification (CLC) codes
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline classification codes
020208; 070103; 0714
Abstract
The purpose of this study is to determine the effect of three improvement methods on nonparametric kernel regression estimators. The improvement methods are applied to the Nadaraya-Watson estimator with cross-validation bandwidth selection, the Nadaraya-Watson estimator with plug-in bandwidth selection, the local linear estimator with plug-in bandwidth selection, and a bias-corrected nonparametric estimator proposed by Yao (2012) based on cross-validation bandwidth selection. The performance of the resulting estimators is evaluated by empirically calculating their mean integrated squared error (MISE), a global discrepancy measure. The first two improvement methods proposed in this study are based on bootstrap bagging and bootstrap bragging procedures, which were originally introduced and studied by Swanepoel (1988, 1990) and thereafter applied, e.g., by Breiman (1996) in machine learning. Bagging and bragging are primarily variance reduction tools. The third improvement method, referred to as boosting, aims to reduce the bias of an estimator and is based on a procedure originally proposed by Tukey (1977). The classical Nadaraya-Watson estimator with plug-in bandwidth selection turns out to be a newly recommendable nonparametric regression estimator, since it is not only as precise and accurate as any of the other estimators, but also computationally much faster than any other nonparametric regression estimator considered in this study.
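For orientation, the following is a minimal illustrative sketch in Python, not the authors' code, of the building blocks named in the abstract: a Nadaraya-Watson kernel regression estimator, bootstrap bagging and bragging (aggregating bootstrap replicates by the mean and by the median, respectively), and a one-step bias correction in the spirit of Tukey's twicing, here standing in for the boosting step. The Gaussian kernel, the fixed bandwidth, the simulated data, and all function names are assumptions made purely for illustration; the estimators studied in the paper use cross-validation or plug-in bandwidth selection and are compared via the empirical MISE over repeated samples.

import numpy as np

def nw_estimate(x_grid, x, y, h):
    """Nadaraya-Watson estimator with a Gaussian kernel, evaluated at x_grid."""
    u = (x_grid[:, None] - x[None, :]) / h
    w = np.exp(-0.5 * u ** 2)                      # kernel weights K((x_grid - x_j) / h)
    return (w @ y) / np.maximum(w.sum(axis=1), 1e-12)

def bagged_nw(x_grid, x, y, h, B=100, aggregate=np.mean, seed=None):
    """Bootstrap-aggregated NW estimator: aggregate=np.mean gives bagging,
    aggregate=np.median gives bragging (robust aggregation)."""
    rng = np.random.default_rng(seed)
    n = len(x)
    reps = np.empty((B, len(x_grid)))
    for b in range(B):
        idx = rng.integers(0, n, size=n)           # resample the pairs (x_i, y_i)
        reps[b] = nw_estimate(x_grid, x[idx], y[idx], h)
    return aggregate(reps, axis=0)

def boosted_nw(x_grid, x, y, h):
    """One-step bias correction in the spirit of Tukey's twicing: refit the
    residuals of a first NW fit and add the fitted residuals back."""
    residuals = y - nw_estimate(x, x, y, h)
    return nw_estimate(x_grid, x, y, h) + nw_estimate(x_grid, x, residuals, h)

def true_m(t):
    """True regression function used to generate the toy data."""
    return np.sin(2.0 * np.pi * t)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, h = 200, 0.15                               # fixed bandwidth, for simplicity only
    x = rng.uniform(0.0, 1.0, n)
    y = true_m(x) + 0.3 * rng.standard_normal(n)
    grid = np.linspace(0.05, 0.95, 200)

    fits = {
        "plain NW":   nw_estimate(grid, x, y, h),
        "bagged NW":  bagged_nw(grid, x, y, h, seed=1),
        "bragged NW": bagged_nw(grid, x, y, h, aggregate=np.median, seed=1),
        "boosted NW": boosted_nw(grid, x, y, h),
    }
    for name, fit in fits.items():
        # crude Riemann approximation of the integrated squared error on the grid
        ise = np.mean((fit - true_m(grid)) ** 2) * (grid[-1] - grid[0])
        print(f"{name:>10s}: ISE = {ise:.4f}")

Averaging such integrated squared errors over many independently simulated samples would yield the kind of empirical MISE comparison reported in the study.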
Pages: 123-148
Number of pages: 26