Training and retraining of neural network trees

Cited: 0
Authors
Zhao, Q [1]
Affiliation
[1] Univ Aizu, Aizu Wakamatsu 9658580, Japan
Source
IJCNN'01: INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, PROCEEDINGS, 2001
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In machine learning, symbolic approaches usually yield comprehensible results but leave no free parameters for further (incremental) retraining. Non-symbolic (connectionist, or neural-network-based) approaches, on the other hand, usually yield black boxes that are difficult to understand and reuse. The goal of this study is to propose a machine learner that is both incrementally retrainable and comprehensible, through the integration of decision trees and neural networks. In this paper, we introduce a kind of neural network tree (NNTree), propose algorithms for training and retraining NNTrees, and verify the efficiency of the algorithms through experiments on a digit recognition problem.
Pages: 726-731 (6 pages)
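The abstract describes an NNTree as a decision tree whose internal nodes are small neural networks that route samples toward the leaves. The following is a minimal illustrative sketch of that idea only, not the paper's actual training or retraining algorithm: the class names, the single-perceptron node, and the one-vs-rest split heuristic are all assumptions made for the example.

```python
# Sketch of a neural network tree (NNTree): a binary tree whose internal
# nodes are tiny neural networks (here, single sigmoid units) that learn
# to route samples left or right; leaves hold majority-vote labels.
# Illustrative only -- not the training/retraining algorithm of the paper.
import numpy as np

class PerceptronNode:
    """Internal node: one sigmoid neuron trained to split the data."""
    def __init__(self, n_features, lr=0.1, epochs=50):
        rng = np.random.default_rng(0)
        self.w = rng.normal(scale=0.1, size=n_features)
        self.b = 0.0
        self.lr, self.epochs = lr, epochs

    def fit(self, X, targets):
        # targets in {0, 1}: desired branch (0 = left, 1 = right)
        for _ in range(self.epochs):
            for x, t in zip(X, targets):
                y = 1.0 / (1.0 + np.exp(-(x @ self.w + self.b)))
                grad = (y - t) * y * (1 - y)  # squared-error gradient
                self.w -= self.lr * grad * x
                self.b -= self.lr * grad

    def route(self, x):
        return int(x @ self.w + self.b > 0)

class NNTree:
    def __init__(self, max_depth=3, min_samples=2):
        self.max_depth, self.min_samples = max_depth, min_samples

    def fit(self, X, y, depth=0):
        if depth >= self.max_depth or len(np.unique(y)) == 1 \
                or len(y) < self.min_samples:
            self.leaf = np.bincount(y).argmax()  # majority-vote leaf
            return self
        self.leaf = None
        # Assumed split heuristic: separate the most frequent class
        # from the rest and let the node learn that boundary.
        major = np.bincount(y).argmax()
        self.node = PerceptronNode(X.shape[1])
        self.node.fit(X, (y != major).astype(float))
        side = np.array([self.node.route(x) for x in X])
        if side.all() or not side.any():  # degenerate split: stop here
            self.leaf = major
            return self
        self.children = [
            NNTree(self.max_depth).fit(X[side == s], y[side == s], depth + 1)
            for s in (0, 1)
        ]
        return self

    def predict_one(self, x):
        if self.leaf is not None:
            return self.leaf
        return self.children[self.node.route(x)].predict_one(x)

# Toy usage: two well-separated 2-D clusters.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
tree = NNTree(max_depth=3).fit(X, y)
acc = np.mean([tree.predict_one(x) == t for x, t in zip(X, y)])
```

Because every node keeps trainable weights, such a tree can in principle be retrained incrementally on new data (by further gradient steps at each node) while the tree structure remains readable as a sequence of learned tests, which is the comprehensibility-plus-retrainability combination the abstract argues for.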
Related Papers (50 records)
  • [31] TRAINING AND RETRAINING: IMPACT ON PERITONITIS
    Bernardini, Judith
    PERITONEAL DIALYSIS INTERNATIONAL, 2010, 30(4): 434-436
  • [32] TRAINING AND RETRAINING OF MANAGERIAL PERSONNEL
    MORALEV, B
    SOVIET EDUCATION, 1972, 15(1): 10-22
  • [33] A GROWTH ALGORITHM FOR NEURAL NETWORK DECISION TREES
    GOLEA, M
    MARCHAND, M
    EUROPHYSICS LETTERS, 1990, 12(3): 205-210
  • [34] Improving Neural Network Quantization without Retraining using Outlier Channel Splitting
    Zhao, Ritchie
    Hu, Yuwei
    Dotzel, Jordan
    De Sa, Christopher
    Zhang, Zhiru
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019
  • [35] Deep Neural Network Initialization With Decision Trees
    Humbird, Kelli D.
    Peterson, J. Luc
    McClarren, Ryan G.
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2019, 30(5): 1286-1295
  • [36] Growing neural network trees efficiently and effectively
    Takeda, T
    Zhao, QF
    DESIGN AND APPLICATION OF HYBRID INTELLIGENT SYSTEMS, 2003, 104: 107-115
  • [37] Efficiently Combining SVD, Pruning, Clustering and Retraining for Enhanced Neural Network Compression
    Goetschalckx, Koen
    Moons, Bert
    Wambacq, Patrick
    Verhelst, Marian
    PROCEEDINGS OF THE 2018 INTERNATIONAL WORKSHOP ON EMBEDDED AND MOBILE DEEP LEARNING (EMDL '18), 2018: 1-6
  • [38] Global Optimality in Neural Network Training
    Haeffele, Benjamin D.
    Vidal, Rene
    30TH IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2017), 2017: 4390-4398
  • [39] Adaptive on-line neural network retraining for real life multimodal emotion recognition
    Ioannou, Spiros
    Kessous, Loic
    Caridakis, George
    Karpouzis, Kostas
    Aharonson, Vered
    Kollias, Stefanos
    ARTIFICIAL NEURAL NETWORKS - ICANN 2006, PT 1, 2006, 4131: 81-92
  • [40] Tikhonov training of the CMAC neural network
    Weruaga, Luis
    Kieslinger, Barbara
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 2006, 17(3): 613-622