Monotonic and dual monotonic language learning

Cited by: 21
Authors
Lange, S
Zeugmann, T
Kapur, S
Affiliations
[1] Kyushu University 33, Research Institute of Fundamental Information Science, Fukuoka 812, Japan
[2] HTWK Leipzig, Fachbereich Mathematik & Informatik, D-04251 Leipzig, Germany
[3] James Cook University of North Queensland, Department of Computer Science, Townsville, QLD 4811, Australia
DOI
10.1016/0304-3975(95)00284-7
Chinese Library Classification
TP301 [Theory and Methods]
Subject classification code
081202
Abstract
Monotonic and dual monotonic language learning from positive as well as from positive and negative examples is investigated. Three different notions of monotonicity are considered. Each of them reflects an alternative formalization of the requirement that the learner has to produce better and better generalizations when fed more and more data on the concept to be learned. Strong-monotonicity requires that only better and better generalizations be produced. Monotonic learning reflects the demand that, for any two guesses, the one output later has to be, with respect to the target language, at least as good as the earlier one. Weak-monotonicity is the learning-theoretic analogue of cumulativity. The corresponding three versions of dual monotonicity describe the requirement that the inference device only produces specializations that fit the target language better and better. Dual strong-monotonic learning generates a chain of shrinking specializations converging to the target language. Dual monotonicity imposes the analogous requirement relative to the complement of the target language, and dual weak-monotonic learning is the dual analogue of cumulativity. The power of each of these types of monotonic and dual monotonic inference from positive as well as from positive and negative data is completely investigated in the context of algorithmic language learning theory, thereby obtaining strong hierarchies.
Pages: 365-410
Number of pages: 46
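
As a reading aid, the six constraints named in the abstract are commonly formalized roughly as follows. This is only a sketch in the notation usual for this line of work, not a quotation of the paper's own definitions: $h_x$ stands for the hypothesis output by the learner at step $x$ on data for a target language $L$, and $L(h)$ for the language generated by hypothesis $h$ (amsmath assumed for the LaTeX).

% Informal sketch of the standard formulations (assumed, not quoted from the paper).
% h_x: hypothesis output at step x; L: target language; L(h): language generated by h.
\begin{align*}
  \text{strong-monotonic:}      \quad & L(h_x) \subseteq L(h_{x+1}) \\
  \text{monotonic:}             \quad & L(h_x) \cap L \subseteq L(h_{x+1}) \cap L \\
  \text{weak-monotonic:}        \quad & \text{if the data seen at step } x+1 \text{ are consistent with } h_x, \\
                                      & \quad \text{then } L(h_x) \subseteq L(h_{x+1}) \\[1ex]
  \text{dual strong-monotonic:} \quad & L(h_{x+1}) \subseteq L(h_x) \\
  \text{dual monotonic:}        \quad & \overline{L(h_x)} \cap \overline{L} \subseteq \overline{L(h_{x+1})} \cap \overline{L} \\
  \text{dual weak-monotonic:}   \quad & \text{if the data seen at step } x+1 \text{ are consistent with } h_x, \\
                                      & \quad \text{then } L(h_{x+1}) \subseteq L(h_x)
\end{align*}

Informally, the monotonic condition says that later guesses correctly cover at least as much of $L$, the dual monotonic condition says that they correctly exclude at least as much of $\overline{L}$, and the weak variants demand this only as long as the data seen so far do not contradict the current hypothesis. The paper's precise definitions, including how consistency is handled for learning from text versus informant, are given in the article itself.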