Sample data selection method for neural network classifiers

Cited by: 0
Authors
Zhou, Yu [1 ,2 ]
Zhu, Anfu [1 ]
Zhou, Lin [3 ]
Qian, Xu [2 ]
Affiliations
[1] School of Electric Power, North China University of Water Resources and Electric Power, Zhengzhou 450011, China
[2] School of Mechanical Electronic and Information Engineering, China University of Mining and Technology, Beijing 100083, China
[3] Research Institute of Information Technology, Tsinghua University, Beijing 100084, China
Keywords
Classification (of information); Data reduction
DOI
Not available
Abstract
In order to improve the performance of neural network classifiers (NNCs), a novel sample data selection method based on shadowed sets is proposed, in which core data and boundary data are established. First, the optimal fuzzy partition matrix of the sample data is obtained with fuzzy c-means (FCM) clustering, and the corresponding shadowed sets are induced from it. Core data and boundary data are then formed from the sample data and the shadowed sets, and the training samples for the NNCs are selected from these core and boundary data. Experiments applying this method to the Iris data set are conducted with a BP neural network, an LVQ neural network, and an extension neural network (ENN). The results show that the proposed method retains the typical samples while reducing the number of training samples, and that training NNCs with the selected samples saves training time, preserves generalization ability, and achieves better overall performance.
Pages: 39-43
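
The abstract describes the pipeline only at a high level (FCM partition, induced shadowed sets, core/boundary split, reduced training set). Below is a minimal, self-contained Python sketch of that pipeline, not the authors' code: the plain FCM implementation, the grid search over the shadowed-set threshold (Pedrycz's balance criterion), and the names fcm, shadow_threshold, and select_samples are all illustrative assumptions.

```python
"""Sketch of shadowed-set-based training sample selection for an NNC:
run FCM, induce a shadowed set per cluster, keep core + boundary samples."""
import numpy as np


def fcm(X, c, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Plain fuzzy c-means; returns membership matrix U of shape (n, c)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(max_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U_new = 1.0 / (d ** (2.0 / (m - 1)))
        U_new /= U_new.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            return U_new
        U = U_new
    return U


def shadow_threshold(u, n_grid=500):
    """Pick alpha in (0, 0.5) balancing the elevated, reduced and shadow
    regions of the membership vector u (Pedrycz's criterion)."""
    best_alpha, best_v = 0.25, np.inf
    for alpha in np.linspace(1e-3, 0.5 - 1e-3, n_grid):
        elevated = (1.0 - u[u >= 1.0 - alpha]).sum()   # raised to 1
        reduced = u[u <= alpha].sum()                   # lowered to 0
        shadow = np.count_nonzero((u > alpha) & (u < 1.0 - alpha))
        v = abs(elevated + reduced - shadow)
        if v < best_v:
            best_alpha, best_v = alpha, v
    return best_alpha


def select_samples(X, c):
    """Return indices of core and boundary samples over all c clusters."""
    U = fcm(X, c)
    core, boundary = set(), set()
    for k in range(c):
        u = U[:, k]
        alpha = shadow_threshold(u)
        core.update(np.where(u >= 1.0 - alpha)[0])
        boundary.update(np.where((u > alpha) & (u < 1.0 - alpha))[0])
    return sorted(core), sorted(boundary)


if __name__ == "__main__":
    # Toy illustration with three synthetic 2-D clusters; the paper's
    # experiments use the Iris data set instead.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(off, 1.0, (50, 2)) for off in ([0, 0], [5, 0], [0, 5])])
    core, boundary = select_samples(X, c=3)
    train_idx = sorted(set(core) | set(boundary))   # reduced training set for the NNC
    print(len(core), len(boundary), len(train_idx))
```

The selected indices (core plus boundary samples) would then replace the full data set when training the BP, LVQ, or ENN classifier, which is where the reported savings in training time come from.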