Learning decision trees in continuous space

Cited: 0
Authors
Dombi, J. [1 ]
Zsiros, A. [1 ]
Affiliations
[1] Department of Applied Informatics, University of Szeged, Árpád tér 2, H-6720 Szeged, Hungary
Source
2001 / University of Szeged / Issue 15
DOI: not available
Abstract
Two problems of the ID3 and C4.5 decision-tree building methods are discussed and solutions are proposed for them. First, both methods compare the applicability of candidate tests with a Gain-type criterion derived from the entropy function. We propose a new measure in place of the entropy function, which comes from a measure of fuzziness based on a monotone fuzzy operator; it is more natural and much simpler to compute in the case of concept learning (when elements belong to only two classes, positive and negative). Second, the well-known extension of ID3 for handling continuous attributes (C4.5) is based on discretizing attribute values, and it separates the decision space with axis-parallel hyperplanes. In our proposed new method (CDT), continuous attributes are handled without discretization, and arbitrary geometric figures, such as hyperplanes in general position, spheres, and ellipsoids, are used to separate the decision space. The power of the new method is demonstrated on a few examples.
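For illustration, the entropy-based Gain criterion that the abstract critiques (the standard ID3/C4.5 split measure) can be sketched as follows for the two-class concept-learning case. This is a minimal sketch of the classical criterion only; the function names are ours, and the paper's fuzziness-based alternative measure is not reproduced here:

```python
import math

def entropy(pos, neg):
    """Binary (concept-learning) entropy of a node holding
    `pos` positive and `neg` negative examples."""
    total = pos + neg
    if total == 0 or pos == 0 or neg == 0:
        return 0.0  # a pure (or empty) node carries no uncertainty
    p = pos / total
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def information_gain(parent, children):
    """ID3/C4.5-style Gain: entropy of the parent node minus the
    weighted average entropy of its children after a candidate test.
    `parent` and each element of `children` are (pos, neg) tuples."""
    n = sum(parent)
    gain = entropy(*parent)
    for pos, neg in children:
        gain -= (pos + neg) / n * entropy(pos, neg)
    return gain

# A test splitting a balanced node of 10 examples into two mostly-pure
# children yields a positive gain:
print(information_gain((5, 5), [(4, 1), (1, 4)]))  # about 0.278
```

The abstract's point is that this measure requires logarithms even in the simple two-class setting, whereas a fuzziness-based measure built from a monotone fuzzy operator can be computed more directly.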
Related papers
(50 total)
  • [31] Learning Decision Trees Using Confusion Entropy
    Jin, Han
    Wang, Xiao-Ning
    Gao, Fei
    Li, Jian
    Wei, Jin-Mao
    PROCEEDINGS OF 2013 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS (ICMLC), VOLS 1-4, 2013, : 560 - 564
  • [32] Feature Learning for Interpretable, Performant Decision Trees
    Good, Jack H.
    Kovach, Torin
    Miller, Kyle
    Dubrawski, Artur
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [33] Learning Decision Trees from Distributed Datasets
    Xie Hongxia
    Shi Liping
    Meng Fanrong
    Wang Chun
    DCABES 2008 PROCEEDINGS, VOLS I AND II, 2008, : 96 - +
  • [34] Learning Decision Trees Using the Fourier Spectrum
    Kushilevitz, E.
    Mansour, Y.
    SIAM JOURNAL ON COMPUTING, 1993, 22 (06) : 1331 - 1348
  • [35] Online transfer learning with multiple decision trees
    Wen, Yimin
    Qin, Yixiu
    Qin, Keke
    Lu, Xiaoxia
    Liu, Pingshan
    International Journal of Machine Learning and Cybernetics, 2019, 10 : 2941 - 2962
  • [36] Convolutional Decision Trees for Feature Learning and Segmentation
    Laptev, Dmitry
    Buhmann, Joachim M.
    PATTERN RECOGNITION, GCPR 2014, 2014, 8753 : 95 - 106
  • [37] Learning from imperfect examples in decision trees
    Janikow, CZ
    COMPUTERS AND THEIR APPLICATIONS - PROCEEDINGS OF THE ISCA 11TH INTERNATIONAL CONFERENCE, 1996, : 71 - 74
  • [38] Learning decision trees with taxonomy of propositionalized attributes
    Kang, Dae-Ki
    Sohn, Kiwook
    PATTERN RECOGNITION, 2009, 42 (01) : 84 - 92
  • [39] A Geometric Algorithm for Learning Oblique Decision Trees
    Manwani, Naresh
    Sastry, P. S.
    PATTERN RECOGNITION AND MACHINE INTELLIGENCE, PROCEEDINGS, 2009, 5909 : 25 - 31
  • [40] Shattering Inequalities for Learning Optimal Decision Trees
    Boutilier, Justin J.
    Michini, Carla
    Zhou, Zachary
    INTEGRATION OF CONSTRAINT PROGRAMMING, ARTIFICIAL INTELLIGENCE, AND OPERATIONS RESEARCH, CPAIOR 2022, 2022, 13292 : 74 - 90