AMITE: A Novel Polynomial Expansion for Analyzing Neural Network Nonlinearities

Cited by: 1
Authors
Sanchirico, Mauro J. [1 ,2 ]
Jiao, Xun [2 ]
Nataraj, C. [3 ]
Affiliations
[1] Lockheed Martin Artificial Intelligence Ctr, Mt Laurel Township, NJ 08054 USA
[2] Villanova Univ, Dept Elect & Comp Engn, Villanova, PA 19085 USA
[3] Villanova Univ, Villanova Ctr Analyt Dynam Syst, Villanova, PA 19085 USA
Keywords
Neural networks; Taylor series; Chebyshev approximation; Transforms; Convergence; Learning systems; Kernel; Approximation; equivalence; Fourier; neural networks; polynomial; Taylor; HARDWARE IMPLEMENTATION; ACTIVATION FUNCTIONS; CONVERGENCE;
DOI
10.1109/TNNLS.2021.3130904
CLC Classification Number
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Polynomial expansions are important in the analysis of neural network nonlinearities. They have been applied to such analysis to address well-known difficulties in verification, explainability, and security. Existing approaches span classical Taylor and Chebyshev methods, asymptotics, and many numerical approaches. We find that, while these have useful properties individually, such as exact error formulas, adjustable domains, and robustness to undefined derivatives, no single approach provides a consistent method yielding an expansion with all of these properties. To address this, we develop an analytically modified integral transform expansion (AMITE), a novel expansion via integral transforms modified using derived criteria for convergence. We present the general expansion and then demonstrate its application to two popular activation functions: the hyperbolic tangent and the rectified linear unit. Compared with existing expansions (i.e., Chebyshev, Taylor, and numerical) employed to this end, AMITE is the first to provide six previously mutually exclusive desired expansion properties, such as exact formulas for the coefficients and exact expansion errors. We demonstrate the effectiveness of AMITE in two case studies. First, a multivariate polynomial form is efficiently extracted from a single-hidden-layer black-box multilayer perceptron (MLP) to facilitate equivalence testing from noisy stimulus-response pairs. Second, a variety of feedforward neural network (FFNN) architectures having between three and seven layers are range-bounded using Taylor models improved by the AMITE polynomials and error formulas. AMITE presents a new dimension of expansion methods suitable for the analysis and approximation of nonlinearities in neural networks, opening new directions and opportunities for the theoretical analysis and systematic testing of neural networks.
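The abstract contrasts AMITE with classical Chebyshev and Taylor expansions of activation nonlinearities. As a minimal point of reference, the sketch below builds a plain Chebyshev least-squares expansion of tanh with NumPy; the interval, degree, and variable names are illustrative assumptions, and the AMITE coefficient formulas themselves are not reproduced here.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Chebyshev expansion of tanh on a chosen interval -- one of the existing
# expansion methods the abstract compares AMITE against. This is NOT the
# AMITE construction (which derives coefficients from modified integral
# transforms); the interval [-3, 3] and degree 15 are illustrative choices.
a, b = -3.0, 3.0
x = np.linspace(a, b, 1001)
cheb = C.Chebyshev.fit(x, np.tanh(x), deg=15, domain=[a, b])

# Empirical worst-case error over the sampled domain. Unlike AMITE's exact
# error formulas, a least-squares fit only yields this sampled estimate.
max_err = np.max(np.abs(cheb(x) - np.tanh(x)))
```

Such a fit converges over the whole chosen interval, whereas a Taylor series of tanh about 0 diverges beyond |x| = pi/2; the trade-off AMITE targets is keeping that adjustable domain while also retaining exact coefficient and error formulas.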
Pages: 5732-5744
Page count: 13
Related Papers
50 entries
  • [1] A novel neural network model of hysteresis nonlinearities
    Tong, Zhao
    Sui, Shulin
    Du, Changhe
    ISDA 2006: SIXTH INTERNATIONAL CONFERENCE ON INTELLIGENT SYSTEMS DESIGN AND APPLICATIONS, VOL 1, 2006, : 84 - 89
  • [2] The polynomial neural network
    Das, S
    INFORMATION SCIENCES, 1995, 87 (04) : 231 - 246
  • [3] Network Analyzing by the Aid of Orbit Polynomial
    Ghorbani, Modjtaba
    Dehmer, Matthias
    SYMMETRY-BASEL, 2021, 13 (05):
  • [4] A novel soft computing technique for the shortcoming of the polynomial neural network
    Kim, D
    Huh, SH
    Seo, SJ
    Park, GT
    INTERNATIONAL JOURNAL OF CONTROL AUTOMATION AND SYSTEMS, 2004, 2 (02) : 189 - 200
  • [5] A novel message passing neural network based on neighborhood expansion
    Yanfeng Xue
    Zhen Jin
    Abeo Timothy Apasiba
    International Journal of Machine Learning and Cybernetics, 2023, 14 : 849 - 860
  • [6] A novel message passing neural network based on neighborhood expansion
    Xue, Yanfeng
    Jin, Zhen
    Apasiba, Abeo Timothy
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2023, 14 (03) : 849 - 860
  • [7] Neural Network Identifiability for a Family of Sigmoidal Nonlinearities
    Vlacic, Verner
    Bolcskei, Helmut
    CONSTRUCTIVE APPROXIMATION, 2022, 55 (01) : 173 - 224
  • [8] Neural Network Identifiability for a Family of Sigmoidal Nonlinearities
    Verner Vlačić
    Helmut Bölcskei
    Constructive Approximation, 2022, 55 : 173 - 224
  • [9] The novel characteristics for training Ridge Polynomial neural network based on Lagrange multiplier
    Deng, Fei
    Shen, Shikai
    He, Jun
    Yue, Weihao
    Qian, Kaiguo
    Miao, Xisong
    Xu, Peng
    Wang, Min
    ALEXANDRIA ENGINEERING JOURNAL, 2023, 67 : 93 - 103
  • [10] The wave expansion neural network
    Dept. of Elec. and Comp. Engineering, Carnegie Mellon University, Pittsburgh, PA 15213, United States
    NEUROCOMPUTING, 3 (237-258):