An improved VC dimension bound for sparse polynomials

Cited by: 1
Author
Schmitt, M [1 ]
Affiliation
[1] Ruhr Univ Bochum, Fak Math, Lehrstuhl Math & Informat, D-44780 Bochum, Germany
Source
LEARNING THEORY, PROCEEDINGS | 2004 / Vol. 3120
DOI
10.1007/978-3-540-27819-1_27
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
We show that the function class consisting of k-sparse polynomials in n variables has Vapnik-Chervonenkis (VC) dimension at least nk + 1. This result supersedes the previously known lower bound obtained via k-term monotone disjunctive normal form (DNF) formulas by Littlestone (1988). Moreover, it implies that the VC dimension of k-sparse polynomials is strictly larger than the VC dimension of k-term monotone DNF. The new bound is established via an exponential approach that employs Gaussian radial basis function (RBF) neural networks to obtain classifications of points in terms of sparse polynomials.
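For intuition about the bound, the smallest case k = 1 (single monomials) already gives VC dimension at least n + 1. The sketch below is not the paper's RBF construction; it is a hypothetical toy verification for n = 2, checking by brute force that three points in {-1, +1}^2 are shattered by classifiers of the form sign(c · x1^e1 · x2^e2), a 1-sparse polynomial thresholded at zero.

```python
from itertools import product

# Toy check (not the paper's construction): for k = 1, n = 2 the claimed
# lower bound is nk + 1 = 3.  We verify that these three points in
# {-1, +1}^2 are shattered by sign(c * x1^e1 * x2^e2).
points = [(1, 1), (-1, 1), (1, -1)]

def sign_pattern(c, e1, e2):
    """Binary labels the monomial c * x1^e1 * x2^e2 assigns to the points."""
    return tuple(1 if c * (x1 ** e1) * (x2 ** e2) > 0 else 0
                 for x1, x2 in points)

# Enumerate all signed monomials with exponents in {0, 1}.
patterns = {sign_pattern(c, e1, e2)
            for c in (1, -1)
            for e1, e2 in product((0, 1), repeat=2)}

# All 2^3 = 8 dichotomies are realized, so the point set is shattered.
print(f"{len(patterns)} of {2 ** len(points)} dichotomies realized")
```

The eight signed monomials act as signed parity functions on {-1, +1}^2, and each induces a distinct labeling of the three points, which is exactly what shattering requires.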
Pages: 393-407
Number of pages: 15