Learning decision trees in continuous space

Cited: 0
Authors
Dombi, J. [1]
Zsiros, A. [1]
Affiliations
[1] Department of Applied Informatics, University of Szeged, Árpád tér 2, H-6720 Szeged, Hungary
Source
2001 / University of Szeged / No. 15
DOI: not available
Abstract
Two problems of the ID3 and C4.5 decision tree building methods are discussed, and solutions are suggested for them. First, both methods use a Gain-type criterion, derived from the entropy function, to compare the applicability of possible tests. We propose a new measure in place of the entropy function, derived from a measure of fuzziness based on a monotone fuzzy operator. It is more natural and much simpler to compute in the case of concept learning (when elements belong to only two classes: positive and negative). Second, the well-known extension of ID3 for handling continuous attributes (C4.5) is based on discretizing attribute values, and it separates the decision space with axis-parallel hyperplanes. In our proposed new method (CDT), continuous attributes are handled without discretization, and arbitrary geometric figures, such as hyperplanes in general position, spheres, and ellipsoids, are used to separate the decision space. The power of the new method is demonstrated on a few examples.
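For reference, the entropy-based Gain criterion that the abstract says ID3 and C4.5 use (and which the paper proposes to replace with a fuzziness measure) can be sketched as follows. This is a generic illustration of information gain, not the authors' CDT method; the function names and the splitting test are illustrative.

```python
from collections import Counter
import math

def entropy(labels):
    """Shannon entropy (in bits) of a sequence of class labels."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def information_gain(examples, labels, test):
    """ID3-style gain of splitting `examples` by a boolean `test`:
    entropy before the split minus the weighted entropy after it."""
    left = [lab for x, lab in zip(examples, labels) if test(x)]
    right = [lab for x, lab in zip(examples, labels) if not test(x)]
    n = len(labels)
    remainder = sum(len(part) / n * entropy(part)
                    for part in (left, right) if part)
    return entropy(labels) - remainder

# A test that separates the two classes perfectly recovers the
# full entropy of the label set as its gain.
gain = information_gain([0, 1, 2, 3], ['+', '+', '-', '-'],
                        lambda x: x < 2)
```

In concept learning with only positive and negative examples, every node's entropy reduces to a function of a single proportion, which is part of why the paper argues a simpler fuzziness-based measure suffices there.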
Related papers (50 in total)
  • [21] Learning from crowds with decision trees
    Yang, Wenjun
    Li, Chaoqun
    Jiang, Liangxiao
    Knowledge and Information Systems, 2022, 64 : 2123 - 2140
  • [22] Learning Optimal Decision Trees with SAT
    Narodytska, Nina
    Ignatiev, Alexey
    Pereira, Filipe
    Marques-Silva, Joao
    PROCEEDINGS OF THE TWENTY-SEVENTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2018, : 1362 - 1368
  • [23] Learning Decision Trees for Unbalanced Data
    Cieslak, David A.
    Chawla, Nitesh V.
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, PART I, PROCEEDINGS, 2008, 5211 : 241 - 256
  • [24] Learning probabilistic decision trees for AUC
    Zhang, H
    Su, J
    PATTERN RECOGNITION LETTERS, 2006, 27 (08) : 892 - 899
  • [26] STOCHASTIC INDUCTION OF DECISION TREES WITH APPLICATION TO LEARNING HAAR TREES
    Alizadeh, Azar
    Singhal, Mukesh
    Behzadan, Vahid
    Tavallali, Pooya
    Ranganath, Aditya
    2022 21ST IEEE INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS, ICMLA, 2022, : 825 - 830
  • [27] Quality Diversity Evolutionary Learning of Decision Trees
    Ferigo, Andrea
    Custode, Leonardo Lucio
    Iacca, Giovanni
    38TH ANNUAL ACM SYMPOSIUM ON APPLIED COMPUTING, SAC 2023, 2023, : 425 - 432
  • [28] Learning Binary Decision Trees by Argmin Differentiation
    Zantedeschi, Valentina
    Kusner, Matt J.
    Niculae, Vlad
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [29] Geometric Heuristics for Transfer Learning in Decision Trees
    Chaubal, Siddhesh
    Rzepecki, Mateusz
    Nicholson, Patrick K.
    Piao, Guangyuan
    Sala, Alessandra
    PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, CIKM 2021, 2021, : 151 - 160
  • [30] Learning Decision Trees Recurrently Through Communication
    Alaniz, Stephan
    Marcos, Diego
    Schiele, Bernt
    Akata, Zeynep
    2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021, 2021, : 13513 - 13522