A tight bound on concept learning

Cited by: 5
Authors
Takahashi, H [1 ]
Gu, HZ [1 ]
Affiliation
[1] Univ Electrocommun, Dept Commun & Syst Engn, Chofu, Tokyo 1828585, Japan
Keywords
backpropagation; generalization error; interpolation dimension; neural networks; PAC learning; sample complexity; VC dimension;
DOI
10.1109/72.728362
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
A tight bound on the generalization performance of concept learning is obtained by a novel approach. Unlike existing theories, the new approach makes no large-sample-size assumption, as the Bayesian approach does, and does not require uniform learnability, as the VC dimension analysis does. We analyze the generalization performance of a particular learning algorithm that is not necessarily well behaved, in the hope that once the learning curves or sample complexity of this algorithm are obtained, they will be applicable to real learning situations. The result is expressed in terms of a dimension called the Boolean interpolation dimension, and it is tight in the sense that it meets the lower-bound requirement of Baum and Haussler. The Boolean interpolation dimension is not greater than the number of modifiable system parameters, and it is definable for almost all real-world networks, such as back-propagation networks and linear-threshold multilayer networks. It is shown that the generalization error follows a beta distribution with parameters m, the number of training examples, and d, the Boolean interpolation dimension. This implies that for large d the learning results tend to the average-case result, a phenomenon known as the self-averaging property of learning. The bound is shown to be applicable to practical learning algorithms that can be modeled by the Gibbs algorithm with a uniform prior. The result is also extended to the case of inconsistent learning.
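The abstract's beta-distribution claim can be illustrated with a short sketch. The specific parameterization Beta(d, m - d + 1) below is an assumption chosen for illustration, not taken from the paper's equations; under it the mean error is d / (m + 1), and the relative spread of the error shrinks as d grows at a fixed sample-to-dimension ratio m/d, which is the self-averaging tendency the abstract describes.

```python
# Hypothetical illustration of a beta-distributed generalization error.
# Assumption (for illustration only, not the paper's exact formula): the
# error eps of a consistent learner follows Beta(d, m - d + 1), where
# m = number of training examples, d = Boolean interpolation dimension.

def mean_generalization_error(m: int, d: int) -> float:
    """Mean of Beta(d, m - d + 1), which equals d / (m + 1)."""
    assert m >= d >= 1
    return d / (m + 1)

def error_std(m: int, d: int) -> float:
    """Standard deviation of Beta(a, b) with a = d, b = m - d + 1."""
    a, b = d, m - d + 1
    n = a + b
    return (a * b / (n * n * (n + 1))) ** 0.5

# Self-averaging: at fixed m/d, the relative spread std/mean shrinks with d.
for d in (10, 100, 1000):
    m = 20 * d  # fixed sample-to-dimension ratio m/d = 20
    mu = mean_generalization_error(m, d)
    print(d, round(mu, 4), round(error_std(m, d) / mu, 3))
```

The printed third column (relative spread) decreases roughly as 1/sqrt(d), so individual learning runs concentrate around the average-case curve as the dimension grows.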
Pages: 1191-1202
Page count: 12