UNBIASED ESTIMATE OF GENERALIZATION ERROR AND MODEL SELECTION IN NEURAL NETWORK

Cited: 24
Authors
LIU, Y [1 ]
Affiliation
[1] BROWN UNIV, PROVIDENCE, RI 02912 USA
Funding
National Science Foundation (NSF), USA;
Keywords
ASYMPTOTICS; CROSS-VALIDATION; GENERALIZATION ERROR; JACKKNIFE ESTIMATOR; KULLBACK-LEIBLER MEASURE; MODEL SELECTION;
DOI
10.1016/0893-6080(94)00089-5
CLC Number
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Model selection is based on the generalization errors of the models under consideration. To estimate a model's generalization error from the training data, the method of cross-validation and the asymptotic form of the jackknife estimator are used. The average of the predictive errors serves as the estimate of the generalization error, and this estimate is also used as the model selection criterion. Its asymptotic form is derived, and an asymptotic model selection criterion is also provided for the case in which the error function is the penalized negative log-likelihood. In the regression case, the paper also proves the asymptotic equivalence of Moody's model selection criterion and the cross-validation method under a condition on the error function.
Pages: 215 - 219
Page count: 5