Assembly Output Codes for Learning Neural Networks

Cited by: 0
Authors
Tigreat, Philippe [1 ]
Lassance, Carlos Rosar Kos [1 ]
Jiang, Xiaoran [2 ]
Gripon, Vincent [1 ]
Berrou, Claude [1 ]
Institutions
[1] Telecom Bretagne, Dept Elect, Plouzane, France
[2] INRIA Rennes, Rennes, France
Keywords
Assembly coding; Clustered Clique Networks; ECOC; Deep Learning; Coding theory; Classification;
DOI
Not available
Chinese Library Classification (CLC)
TP301 [Theory, Methods];
Discipline code
081202 ;
Abstract
Neural network-based classifiers usually encode the class labels of input data via a completely disjoint code, i.e., a binary vector with only one bit associated with each category. We use coding theory to propose assembly codes in which each element is associated with several classes, making for better target vectors. These codes emulate the combination of several classifiers, a well-known method for improving decision accuracy. Our experiments on datasets such as MNIST with a multi-layer neural network show that assembly output codes, which are characterized by a higher minimum Hamming distance, result in better classification performance. These codes are also well suited to the use of clustered clique-based networks in category representation.
Pages
285 - 289 (5 pages)
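The abstract's central claim is that spreading class membership over shared bits raises the minimum Hamming distance between the network's target vectors. A minimal sketch of this idea (the codewords below are illustrative, chosen for this example, and are not the codes constructed in the paper):

```python
from itertools import combinations

def min_hamming(codebook):
    """Minimum pairwise Hamming distance between equal-length codewords."""
    return min(
        sum(a != b for a, b in zip(u, v))
        for u, v in combinations(codebook, 2)
    )

# One-hot ("completely disjoint") codes for 4 classes:
# each bit is associated with exactly one class.
one_hot = ["1000", "0100", "0010", "0001"]

# A denser code in which each bit is shared by two classes,
# in the spirit of assembly/ECOC output coding.
assembly = ["110100", "101010", "011001", "000111"]

print(min_hamming(one_hot))   # 2
print(min_hamming(assembly))  # 4
```

With minimum distance d, a nearest-codeword decoder can correct up to floor((d-1)/2) flipped output bits, so the assembly-style code above tolerates one erroneous bit per prediction while one-hot coding tolerates none.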