GAAF: Searching Activation Functions for Binary Neural Networks Through Genetic Algorithm

Cited by: 2
Authors
Li, Yanfei [1 ]
Geng, Tong [2 ]
Stein, Samuel [2 ]
Li, Ang [2 ]
Yu, Huimin [1 ]
Affiliations
[1] Zhejiang Univ, Dept Informat Sci & Elect Engn, Hangzhou 310027, Peoples R China
[2] Pacific Northwest Natl Lab, Richland, WA 99354 USA
Source
TSINGHUA SCIENCE AND TECHNOLOGY | 2023, Vol. 28, No. 1
Keywords
binary neural networks (BNNs); genetic algorithm; activation function;
DOI
10.26599/TST.2021.9010084
CLC Classification Number
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Binary neural networks (BNNs) show promise in cost- and power-constrained domains such as edge devices and mobile systems, owing to their significantly lower computation and storage demands, but at the cost of degraded accuracy. To close this accuracy gap, in this paper we propose adding a complementary activation function (AF) ahead of the sign-based binarization and relying on a genetic algorithm (GA) to automatically search for suitable AFs. These AFs help extract extra information from the input data in the forward pass, while enabling improved gradient approximation in the backward pass. Fifteen novel AFs are identified through our GA-based search, and most of them show improved performance (up to 2.54% on ImageNet) when tested on different datasets and network models. Interestingly, periodic functions are identified as a key component of most of the discovered AFs, a property that rarely appears in human-designed AFs. Our method offers a novel approach for designing general and application-specific BNN architectures. GAAF will be released on GitHub.
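The abstract describes inserting a complementary AF immediately before the sign binarizer, with a gradient approximation used in the backward pass. The sketch below illustrates that arrangement in PyTorch under stated assumptions: the SignSTE, ComplementaryAF, and BinaryActivation classes, the alpha/beta parameters, and the periodic form x + alpha*sin(beta*x) are illustrative stand-ins motivated by the abstract's remark about periodic components, not the AFs actually discovered by GAAF's genetic search.

```python
# Minimal sketch: complementary AF placed ahead of sign binarization, with a
# straight-through estimator (STE) for the backward pass. The periodic AF used
# here is a hypothetical example, not one of the paper's searched functions.
import torch
import torch.nn as nn


class SignSTE(torch.autograd.Function):
    """Sign binarization; gradients pass through only where |x| <= 1 (clipped STE)."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        return grad_output * (x.abs() <= 1).to(grad_output.dtype)


class ComplementaryAF(nn.Module):
    """Hypothetical complementary AF with a learnable periodic term."""

    def __init__(self, alpha=0.1, beta=2.0):
        super().__init__()
        self.alpha = nn.Parameter(torch.tensor(alpha))  # amplitude of periodic term
        self.beta = nn.Parameter(torch.tensor(beta))    # frequency of periodic term

    def forward(self, x):
        return x + self.alpha * torch.sin(self.beta * x)


class BinaryActivation(nn.Module):
    """Complementary AF followed by sign binarization, as sketched in the abstract."""

    def __init__(self):
        super().__init__()
        self.af = ComplementaryAF()

    def forward(self, x):
        return SignSTE.apply(self.af(x))


if __name__ == "__main__":
    x = torch.randn(4, 8, requires_grad=True)
    y = BinaryActivation()(x)
    y.sum().backward()   # gradients flow through the STE and the complementary AF
    print(y.unique())    # outputs are binary (-1/+1; 0 only if an input is exactly 0)
```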
Pages: 207-220
Number of Pages: 14