A tight bound on concept learning

Cited by: 5
Authors
Takahashi, H [1]
Gu, HZ [1]
Affiliations
[1] Univ Electrocommun, Dept Commun & Syst Engn, Chofu, Tokyo 1828585, Japan
Source
Keywords
backpropagation; generalization error; interpolation dimension; neural networks; PAC learning; sample complexity; VC dimension;
DOI
10.1109/72.728362
CLC classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
A tight bound on the generalization performance of concept learning is shown by a novel approach. Unlike existing theories, the new approach makes no large-sample assumption, as the Bayesian approach does, and does not require uniform learnability, as the VC-dimension analysis does. We analyze the generalization performance of a particular learning algorithm that is not necessarily well behaved, in the hope that once the learning curves or sample complexity of this algorithm are obtained, they will be applicable to real learning situations. The result is expressed in terms of a dimension called the Boolean interpolation dimension, and it is tight in the sense that it meets the lower-bound requirement of Baum and Haussler. The Boolean interpolation dimension is no greater than the number of modifiable system parameters, and it is definable for almost all real-world networks, such as back-propagation networks and linear threshold multilayer networks. It is shown that the generalization error follows a beta distribution with parameters m, the number of training examples, and d, the Boolean interpolation dimension. This implies that for large d the learning results tend to the average-case result, a phenomenon known as the self-averaging property of learning. The bound is shown to be applicable to practical learning algorithms that can be modeled by the Gibbs algorithm with a uniform prior. The result is also extended to the case of inconsistent learning.
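The beta-distribution bound described in the abstract can be explored numerically. The parameterization below is an illustrative assumption, not the paper's exact statement: the classical average-case result for a consistent learner with d degrees of freedom gives an expected error of roughly d/(m+1), which matches a Beta(d, m-d+1) law; the paper's precise parameters may differ.

```python
import math

def beta_pdf(x, a, b):
    """Density of Beta(a, b), computed via log-gamma for numerical stability."""
    log_c = math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
    return math.exp(log_c + (a - 1) * math.log(x) + (b - 1) * math.log(1 - x))

def tail_prob(eps, m, d, steps=20000):
    """P(generalization error > eps) under the ASSUMED Beta(d, m-d+1) law,
    by trapezoidal integration of the density over (eps, 1)."""
    a, b = d, m - d + 1
    h = (1.0 - eps) / steps
    total = 0.0
    for i in range(steps + 1):
        x = min(max(eps + i * h, 1e-12), 1 - 1e-12)  # avoid log(0) at endpoints
        w = 0.5 if i in (0, steps) else 1.0
        total += w * beta_pdf(x, a, b)
    return total * h

# Example: m training examples, Boolean interpolation dimension d.
m, d = 1000, 20
mean_err = d / (m + 1)  # mean of Beta(d, m-d+1)
print(f"expected generalization error: {mean_err:.4f}")
print(f"P(error > 0.05): {tail_prob(0.05, m, d):.6f}")
```

Under this assumed parameterization, the expected error decays as d/(m+1), and the tail probability drops sharply once eps exceeds the mean, consistent with the self-averaging behavior mentioned in the abstract.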
Pages: 1191-1202
Page count: 12