A PERCEPTRON-BASED FEATURE SELECTION APPROACH FOR DECISION TREE CLASSIFICATION

Cited by: 0
Authors
Casaroti, Carla Jaqueline [1 ]
Silva Centeno, Jorge Antonio [1 ]
Fuchs, Stephan [2 ]
Affiliations
[1] Univ Fed Parana UFPR, Dept Geomat, Curitiba, Parana, Brazil
[2] Karlsruher Inst Technol KIT, Fachbereich Siedlungswasserwirtschaft & Wassergut, Karlsruhe, Germany
Source
BOLETIM DE CIENCIAS GEODESICAS | 2020, Vol. 26, No. 03
Keywords
Feature Selection (FS); Perceptron; Decision tree; VEGETATION; IMAGERY;
DOI
10.1590/s1982-21702020000300015
Chinese Library Classification (CLC)
P3 [Geophysics]; P59 [Geochemistry];
Subject Classification Codes
0708 ; 070902 ;
Abstract
The use of OBIA for high spatial resolution image classification can be divided into two main steps: segmentation, followed by the labeling of the objects according to a particular set of features and a classifier. Decision trees are often used to represent human knowledge in the latter step. The issue lies in how to select a smaller number of features, from a feature space of spatial, spectral and textural variables, to describe the classes of interest, which raises the question of choosing the best or most convenient feature selection (FS) method. In this work, an approach for FS within a decision tree was introduced using a single perceptron and the backpropagation algorithm. Three alternatives were compared: single, double and multiple inputs, using a sequential backward search (SBS). Test regions were used to evaluate the efficiency of the proposed methods. The results showed that it is possible to use a single perceptron in each node, with an overall accuracy (OA) between 77.6% and 77.9%; only SBS reached an OA above 88%. Thus, the quality of the proposed solution depends on the number of input features.
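The abstract only outlines the method, so the following minimal Python sketch illustrates the general idea rather than the authors' implementation: a sequential backward search that scores each candidate feature subset with the accuracy of a single sigmoid perceptron trained by gradient descent, as one might do at a decision-tree node. The function names (train_perceptron, sbs), the synthetic data and all parameter values are hypothetical assumptions introduced here for illustration only.

    # Illustrative sketch (assumed, not the paper's code): SBS with a
    # single-perceptron accuracy criterion, NumPy only.
    import numpy as np

    def train_perceptron(X, y, epochs=200, lr=0.1):
        """Train one sigmoid unit by gradient descent; return its training accuracy."""
        rng = np.random.default_rng(0)
        w = rng.normal(scale=0.1, size=X.shape[1])
        b = 0.0
        for _ in range(epochs):
            p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid output
            grad = p - y                             # cross-entropy gradient
            w -= lr * X.T @ grad / len(y)
            b -= lr * grad.mean()
        return np.mean((p >= 0.5) == y)              # accuracy of the fitted unit

    def sbs(X, y, target_dim):
        """Drop one feature at a time, keeping the subset whose perceptron scores best."""
        selected = list(range(X.shape[1]))
        while len(selected) > target_dim:
            scores = [(train_perceptron(X[:, [f for f in selected if f != d]], y), d)
                      for d in selected]
            best_score, drop = max(scores)           # least useful feature to remove
            selected.remove(drop)
            print(f"dropped feature {drop}, accuracy of remaining set: {best_score:.3f}")
        return selected

    if __name__ == "__main__":
        # Synthetic stand-in for spectral/spatial/textural object features.
        rng = np.random.default_rng(42)
        X = rng.normal(size=(300, 8))
        y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(float)  # two informative features
        print("kept features:", sbs(X, y, target_dim=2))

On this synthetic example the search should retain the two informative columns; in the paper's setting the same loop would be driven by the object features produced by segmentation, evaluated per tree node.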
Pages: 1-17
Number of pages: 17