Efficient VLSI Implementation of Neural Networks With Hyperbolic Tangent Activation Function

Cited by: 105
Authors
Zamanlooy, Babak [1 ]
Mirhassani, Mitra [1 ]
Affiliations
[1] Univ Windsor, Dept Elect & Comp Engn, Windsor, ON N9B 3P4, Canada
Keywords
Hyperbolic tangent; neural networks; nonlinear activation function; VLSI implementation; SIGMOID FUNCTION; HARDWARE IMPLEMENTATION; GENERATORS; DESIGN;
DOI
10.1109/TVLSI.2012.2232321
CLC number
TP3 [Computing technology, computer technology]
Subject classification code
0812
Abstract
The nonlinear activation function is one of the main building blocks of artificial neural networks, and the hyperbolic tangent and sigmoid are the most widely used. Accurate implementation of these transfer functions in digital networks faces certain challenges. In this paper, an efficient approximation scheme for the hyperbolic tangent function is proposed. The approximation is based on a mathematical analysis that treats the maximum allowable error as a design parameter. A hardware implementation of the proposed approximation scheme is presented, and it compares favorably with previous architectures in terms of area and delay. The proposed structure requires fewer output bits than the state of the art for the same maximum allowable error. Since the number of output bits of the activation function determines the bit width of the multipliers and adders in the network, the proposed activation function reduces area, delay, and power in VLSI implementations of artificial neural networks with hyperbolic tangent activation.
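The abstract describes a tanh approximation sized by a maximum-allowable-error budget. As an illustrative sketch only (not the paper's exact scheme; the names, the uniform piecewise-linear segmentation, and the constants `EPS` and `X_MAX` below are assumptions for illustration), one can exploit the odd symmetry tanh(-x) = -tanh(x) and saturation for large |x|, then grow the segment count until a sampled maximum error meets the budget:

```python
import math

# Hypothetical design parameters (not from the paper):
EPS = 2 ** -6    # maximum allowable error budget (~0.0156)
X_MAX = 4.0      # beyond this, tanh(x) is within EPS of 1, so saturate

def build_segments(n_seg):
    """Breakpoints of a uniform piecewise-linear interpolant of tanh on [0, X_MAX]."""
    xs = [X_MAX * i / n_seg for i in range(n_seg + 1)]
    ys = [math.tanh(x) for x in xs]
    return xs, ys

def pwl_tanh(x, xs, ys):
    """Piecewise-linear tanh using odd symmetry and saturation."""
    s = -1.0 if x < 0 else 1.0
    x = abs(x)
    if x >= X_MAX:
        return s * 1.0                       # saturation region
    i = min(int(x / X_MAX * (len(xs) - 1)), len(xs) - 2)
    t = (x - xs[i]) / (xs[i + 1] - xs[i])    # position within segment i
    return s * (ys[i] + t * (ys[i + 1] - ys[i]))

# Double the segment count until the sampled max error meets the budget.
n = 1
while True:
    xs, ys = build_segments(n)
    err = max(abs(pwl_tanh(k / 1000.0, xs, ys) - math.tanh(k / 1000.0))
              for k in range(0, 8001))
    if err < EPS:
        break
    n *= 2

print(f"{n} segments, max sampled error {err:.5f}")
```

In a hardware mapping, the breakpoint slopes and intercepts would be quantized to the chosen output bit width; the paper's point is that a tighter analytic error bound lets that bit width shrink, which in turn narrows every downstream multiplier and adder.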
Pages: 39-48
Page count: 10
Related papers
50 records
  • [21] Efficiently inaccurate approximation of hyperbolic tangent used as transfer function in artificial neural networks
    Simos, T. E.
    Tsitouras, Ch.
    NEURAL COMPUTING & APPLICATIONS, 2021, 33(16): 10227-10233
  • [22] ANALOG VLSI IMPLEMENTATION OF NEURAL NETWORKS
    VITTOZ, E
    ARTIFICIAL NEURAL NETWORKS: JOURNEES D'ELECTRONIQUE 1989, 1989: 223-250
  • [23] Optimal VLSI implementation of neural networks
    Beiu, V
    NEURAL NETWORKS AND THEIR APPLICATIONS, 1996: 255-276
  • [24] VLSI implementation of pulsating neural networks
    Schwartzglass, O
    Agranat, AJ
    NEUROCOMPUTING, 1996, 10(04): 405-413
  • [25] ANALOG VLSI IMPLEMENTATION OF NEURAL NETWORKS
    VERLEYSEN, M
    JESPERS, P
    ARTIFICIAL NEURAL NETWORKS: JOURNEES D'ELECTRONIQUE 1989, 1989: 279-289
  • [26] Hyperbolic Hopfield neural networks with directional multistate activation function
    Kobayashi, Masaki
    NEUROCOMPUTING, 2018, 275: 2217-2226
  • [27] Artificial Neural Network with Hyperbolic Tangent Activation Function to Improve the Accuracy of COCOMO II Model
    Alshalif, Sarah Abdulkarem
    Ibrahim, Noraini
    Herawan, Tutut
    RECENT ADVANCES ON SOFT COMPUTING AND DATA MINING, 2017, 549: 81-90
  • [28] Very High Accuracy Hyperbolic Tangent Function Implementation in FPGAs
    Hajduk, Zbigniew
    Dec, Grzegorz Rafal
    IEEE ACCESS, 2023, 11: 23701-23713
  • [29] Stochastic Implementation of the Activation Function for Artificial Neural Networks
    Yeo, Injune
    Gi, Sang-gyun
    Lee, Byung-geun
    Chu, Myonglae
    PROCEEDINGS OF 2016 IEEE BIOMEDICAL CIRCUITS AND SYSTEMS CONFERENCE (BIOCAS), 2016: 440-443
  • [30] A Quantum Activation Function for Neural Networks: Proposal and Implementation
    Kumar, Saurabh
    Dangwal, Siddharth
    Adhikary, Soumik
    Bhowmik, Debanjan
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021