AMITE: A Novel Polynomial Expansion for Analyzing Neural Network Nonlinearities

Cited by: 1
Authors
Sanchirico, Mauro J. [1 ,2 ]
Jiao, Xun [2 ]
Nataraj, C. [3 ]
Affiliations
[1] Lockheed Martin Artificial Intelligence Ctr, Mt Laurel Township, NJ 08054 USA
[2] Villanova Univ, Dept Elect & Comp Engn, Villanova, PA 19085 USA
[3] Villanova Univ, Villanova Ctr Analyt Dynam Syst, Villanova, PA 19085 USA
Keywords
Neural networks; Taylor series; Chebyshev approximation; Transforms; Convergence; Learning systems; Kernel; Approximation; Equivalence; Fourier; Polynomial; Hardware implementation; Activation functions
DOI
10.1109/TNNLS.2021.3130904
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Polynomial expansions are important in the analysis of neural network nonlinearities and have been applied to address well-known difficulties in verification, explainability, and security. Existing approaches span classical Taylor and Chebyshev methods, asymptotics, and many numerical techniques. We find that, while these individually have useful properties, such as exact error formulas, an adjustable domain, and robustness to undefined derivatives, no single approach provides a consistent method yielding an expansion with all of these properties. To address this, we develop an analytically modified integral transform expansion (AMITE), a novel expansion via integral transforms modified using derived criteria for convergence. We present the general expansion and then demonstrate its application to two popular activation functions: the hyperbolic tangent and the rectified linear unit. Compared with existing expansions (i.e., Chebyshev, Taylor, and numerical) employed to this end, AMITE is the first to provide six previously mutually exclusive desired expansion properties, such as exact formulas for the coefficients and exact expansion errors. We demonstrate the effectiveness of AMITE in two case studies. First, a multivariate polynomial form is efficiently extracted from a single-hidden-layer black-box multilayer perceptron (MLP) to facilitate equivalence testing from noisy stimulus-response pairs. Second, a variety of feedforward neural network (FFNN) architectures with three to seven layers are range-bounded using Taylor models improved by the AMITE polynomials and error formulas. AMITE presents a new dimension of expansion methods suitable for the analysis and approximation of nonlinearities in neural networks, opening new directions and opportunities for the theoretical analysis and systematic testing of neural networks.
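The trade-off the abstract highlights can be illustrated with a minimal sketch: a truncated Taylor (Maclaurin) series of tanh has exact coefficient formulas but a fixed, narrow region of accuracy, while a polynomial fitted over an adjustable domain (here a plain least-squares fit, standing in for a Chebyshev-style expansion) stays accurate across that whole domain but lacks closed-form error expressions. This is a generic comparison under stated assumptions, not the AMITE construction itself.

```python
import numpy as np

def tanh_taylor(x, terms=4):
    # Maclaurin series of tanh: x - x^3/3 + 2x^5/15 - 17x^7/315 + ...
    coeffs = [1.0, -1.0 / 3.0, 2.0 / 15.0, -17.0 / 315.0]
    return sum(c * x ** (2 * k + 1) for k, c in enumerate(coeffs[:terms]))

xs = np.linspace(-3.0, 3.0, 601)

# The Taylor series of tanh converges only for |x| < pi/2, so its error
# explodes near the ends of [-3, 3].
taylor_err = np.max(np.abs(np.tanh(xs) - tanh_taylor(xs)))

# Degree-7 least-squares polynomial fit over the full interval: a stand-in
# for an adjustable-domain (Chebyshev-style) expansion.
fit = np.polynomial.Polynomial.fit(xs, np.tanh(xs), deg=7)
fit_err = np.max(np.abs(np.tanh(xs) - fit(xs)))

print(f"Taylor max error on [-3,3]: {taylor_err:.3g}")
print(f"Domain-fitted max error on [-3,3]: {fit_err:.3g}")
```

The domain-fitted polynomial is orders of magnitude more accurate on [-3, 3], while the Taylor series keeps its exact coefficient and error formulas; combining both kinds of properties in one expansion is the gap AMITE is stated to fill.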
Pages: 5732-5744
Page count: 13