Evaluation of Gradient Descent Optimization: Using Android Applications in Neural Networks

Cited by: 2
Authors
Alshahrani, Hani [1 ]
Alzahrani, Abdulrahman [1 ]
Alshehri, Ali [1 ]
Alharthi, Raed [1 ]
Fu, Huirong [1 ]
Affiliation
[1] Oakland Univ, Sch Engn & Comp Sci, Rochester, MI 48309 USA
Funding
National Science Foundation (USA)
Keywords
neural networks; gradient descent optimizers; loss function; Android;
DOI
10.1109/CSCI.2017.257
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Artificial neural networks have gained prominence in applications such as medical diagnosis and malware detection. A neural network model's error rate reflects its performance, and optimization algorithms can minimize that error rate by updating the network's parameters toward an optimal solution. This paper uses permissions and underlying Linux system information features on the Android platform to evaluate gradient descent optimization algorithms in neural networks. The optimizers are evaluated by running them on a set of Android applications to identify the best-performing one, and each optimizer is assessed with both its default and adjusted parameter values. The evaluation shows that the best accuracy, 92.21%, is achieved by the Adam optimizer.
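The comparison described in the abstract can be illustrated with a minimal, hypothetical sketch: training the same model with plain SGD and with Adam, then comparing accuracy. The data below is synthetic (the paper's actual features are Android permissions and Linux system information, which are not reproduced here), and all function names and hyperparameters are illustrative assumptions, not the authors' implementation.

```python
import math
import random

random.seed(0)

def make_data(n=200, d=8):
    # Synthetic stand-in for a binary app-classification task:
    # the label depends on the sum of the feature values.
    data = []
    for _ in range(n):
        x = [random.gauss(0, 1) for _ in range(d)]
        y = 1.0 if sum(x) > 0 else 0.0
        data.append((x, y))
    return data

def sigmoid(z):
    # Numerically stable logistic function.
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

def grad(w, x, y):
    # Gradient of the binary cross-entropy loss for one example.
    p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
    return [(p - y) * xi for xi in x]

def train_sgd(data, d, lr=0.1, epochs=20):
    # Plain stochastic gradient descent: w <- w - lr * g.
    w = [0.0] * d
    for _ in range(epochs):
        for x, y in data:
            g = grad(w, x, y)
            w = [wi - lr * gi for wi, gi in zip(w, g)]
    return w

def train_adam(data, d, lr=0.01, b1=0.9, b2=0.999, eps=1e-8, epochs=20):
    # Adam: bias-corrected first and second moment estimates of the gradient.
    w = [0.0] * d
    m = [0.0] * d  # first-moment (mean) estimate
    v = [0.0] * d  # second-moment (uncentered variance) estimate
    t = 0
    for _ in range(epochs):
        for x, y in data:
            t += 1
            g = grad(w, x, y)
            m = [b1 * mi + (1 - b1) * gi for mi, gi in zip(m, g)]
            v = [b2 * vi + (1 - b2) * gi * gi for vi, gi in zip(v, g)]
            mhat = [mi / (1 - b1 ** t) for mi in m]
            vhat = [vi / (1 - b2 ** t) for vi in v]
            w = [wi - lr * mh / (math.sqrt(vh) + eps)
                 for wi, mh, vh in zip(w, mhat, vhat)]
    return w

def accuracy(w, data):
    correct = sum(
        1 for x, y in data
        if (sigmoid(sum(wi * xi for wi, xi in zip(w, x))) >= 0.5) == (y == 1.0)
    )
    return correct / len(data)

data = make_data()
d = len(data[0][0])
acc_sgd = accuracy(train_sgd(data, d), data)
acc_adam = accuracy(train_adam(data, d), data)
print(f"SGD accuracy:  {acc_sgd:.3f}")
print(f"Adam accuracy: {acc_adam:.3f}")
```

The same loop structure extends to the paper's other candidates (e.g. momentum, RMSprop): only the per-step update rule changes, which is what makes a like-for-like comparison with default versus adjusted hyperparameters straightforward.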
Pages: 1471-1476 (6 pages)
Related Papers (50 total)
  • [1] Optimization of Graph Neural Networks with Natural Gradient Descent
    Izadi, Mohammad Rasool
    Fang, Yihao
    Stevenson, Robert
    Lin, Lizhen
    2020 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2020, : 171 - 179
  • [2] Learning dynamics of gradient descent optimization in deep neural networks
    Wu, Wei
    Jing, Xiaoyuan
    Du, Wencai
    Chen, Guoliang
    SCIENCE CHINA-INFORMATION SCIENCES, 2021, 64 (05)
  • [4] Evolutionary Stochastic Gradient Descent for Optimization of Deep Neural Networks
    Cui, Xiaodong
    Zhang, Wei
    Tuske, Zoltan
    Picheny, Michael
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [6] Optimization of learning process for Fourier series neural networks using gradient descent algorithm
    Halawa, Krzysztof
    PRZEGLAD ELEKTROTECHNICZNY, 2008, 84 (06): : 128 - 130
  • [7] Using Particle Swarm Optimization with Gradient Descent for Parameter Learning in Convolutional Neural Networks
    Wessels, Steven
    van der Haar, Dustin
    PROGRESS IN PATTERN RECOGNITION, IMAGE ANALYSIS, COMPUTER VISION, AND APPLICATIONS, CIARP 2021, 2021, 12702 : 119 - 128
  • [8] Strengthening Gradient Descent by Sequential Motion Optimization for Deep Neural Networks
    Le-Duc, Thang
    Nguyen, Quoc-Hung
    Lee, Jaehong
    Nguyen-Xuan, H.
    IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, 2023, 27 (03) : 565 - 579
  • [9] INVERSION OF NEURAL NETWORKS BY GRADIENT DESCENT
    KINDERMANN, J
    LINDEN, A
    PARALLEL COMPUTING, 1990, 14 (03) : 277 - 286
  • [10] Gradient Descent for Spiking Neural Networks
    Huh, Dongsung
    Sejnowski, Terrence J.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31