A tight bound on concept learning

Cited by: 5
Authors
Takahashi, H [1]
Gu, HZ [1]
Affiliations
[1] Univ Electrocommun, Dept Commun & Syst Engn, Chofu, Tokyo 1828585, Japan
Keywords
backpropagation; generalization error; interpolation dimension; neural networks; PAC learning; sample complexity; VC dimension
DOI
10.1109/72.728362
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
A tight bound on the generalization performance of concept learning is established by a novel approach. Unlike existing theories, the new approach makes no large-sample-size assumption, as in the Bayesian approach, and does not require uniform learnability, as in VC dimension analysis. We analyze the generalization performance of a particular learning algorithm that is not necessarily well behaved, in the hope that once the learning curve or sample complexity of this algorithm is obtained, it is applicable to real learning situations. The result is expressed in terms of a dimension called the Boolean interpolation dimension, and it is tight in the sense that it meets the lower-bound requirement of Baum and Haussler. The Boolean interpolation dimension is not greater than the number of modifiable system parameters, and it is definable for almost all real-world networks, such as back-propagation networks and linear threshold multilayer networks. It is shown that the generalization error follows a beta distribution whose parameters are m, the number of training examples, and d, the Boolean interpolation dimension. This implies that for large d the learning results tend to the average-case result, a behavior known as the self-averaging property of learning. The bound is shown to be applicable to practical learning algorithms that can be modeled by the Gibbs algorithm with a uniform prior. The result is also extended to the case of inconsistent learning.
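To make the beta-distribution statement concrete, the following is one illustrative parametrization, stated only as an assumption for exposition and not as the paper's exact result: if the generalization error ε of a consistent learner with Boolean interpolation dimension d trained on m examples is modeled as Beta(d, m - d + 1), its mean scales as d/m, matching the Ω(d/m) scaling of the Baum-Haussler lower bound, and its relative fluctuation shrinks like 1/√d, which is the self-averaging behavior described in the abstract.

% Illustrative sketch only: the Beta(d, m-d+1) parametrization is assumed
% for exposition; the paper's exact form may differ.
\[
  \epsilon \sim \mathrm{Beta}(d,\ m-d+1), \qquad
  \mathbb{E}[\epsilon] = \frac{d}{m+1},
\]
\[
  \operatorname{Var}[\epsilon] = \frac{d\,(m-d+1)}{(m+1)^{2}(m+2)}
  \quad\Longrightarrow\quad
  \frac{\sqrt{\operatorname{Var}[\epsilon]}}{\mathbb{E}[\epsilon]}
  \approx \frac{1}{\sqrt{d}} \quad (m \gg d).
\]

Under this assumed form, the mean error reproduces the d/m dependence of the tight bound, while the vanishing relative fluctuation for large d illustrates why individual learning runs concentrate around the average-case learning curve.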
Pages: 1191-1202
Page count: 12