Evaluation of Gradient Descent Optimization: Using Android Applications in Neural Networks

Cited by: 2
Authors
Alshahrani, Hani [1 ]
Alzahrani, Abdulrahman [1 ]
Alshehri, Ali [1 ]
Alharthi, Raed [1 ]
Fu, Huirong [1 ]
Affiliations
[1] Oakland Univ, Sch Engn & Comp Sci, Rochester, MI 48309 USA
Funding
U.S. National Science Foundation;
Keywords
neural networks; gradient descent optimizers; loss function; Android;
DOI
10.1109/CSCI.2017.257
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Artificial neural networks have gained prominence through applications such as medical diagnosis and malware detection. A neural network model's error rate indicates its performance, and optimization algorithms can minimize that error rate by updating the network's parameters toward an optimal solution. This paper uses permissions and underlying Linux system information features of the Android platform to evaluate gradient descent optimization algorithms in neural networks. The optimizers are evaluated by running them on a set of Android applications to identify the best-performing one, and each optimizer is assessed with both its default and adjusted parameter values. The evaluation shows that the best accuracy, 92.21%, is achieved by the Adam optimizer.
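The abstract compares gradient descent optimizers, including Adam, by how well they minimize a loss. As a hedged illustration (not the paper's actual experiment, which trains on Android application features), the sketch below contrasts plain gradient descent with the standard Adam update rule on a toy one-dimensional loss f(w) = (w - 3)^2; all function names and hyperparameter values here are illustrative assumptions.

```python
import math

def grad(w):
    # Gradient of the toy loss f(w) = (w - 3)^2, minimized at w = 3.
    return 2.0 * (w - 3.0)

def plain_gd(w, lr=0.1, steps=200):
    # Vanilla gradient descent: step against the gradient at a fixed rate.
    for _ in range(steps):
        w -= lr * grad(w)
    return w

def adam(w, lr=0.1, steps=200, b1=0.9, b2=0.999, eps=1e-8):
    # Adam: per-parameter step sizes from running moment estimates.
    m = v = 0.0
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1 - b1) * g        # first-moment (mean) estimate
        v = b2 * v + (1 - b2) * g * g    # second-moment (uncentered variance) estimate
        m_hat = m / (1 - b1 ** t)        # bias correction for the warm-up phase
        v_hat = v / (1 - b2 ** t)
        w -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return w

# Both optimizers move the parameter from 0.0 toward the minimum at w = 3.
print(plain_gd(0.0))
print(adam(0.0))
```

Evaluating optimizers as the paper does amounts to running each update rule under default and adjusted hyperparameters and comparing the resulting model accuracy.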
Pages: 1471-1476
Page count: 6
Related papers
50 records
  • [41] Convergence rates for shallow neural networks learned by gradient descent
    Braun, Alina
    Kohler, Michael
    Langer, Sophie
    Walk, Harro
    BERNOULLI, 2024, 30 (01) : 475 - 502
  • [42] Time delay learning by gradient descent in Recurrent Neural Networks
    Boné, R
    Cardot, H
    ARTIFICIAL NEURAL NETWORKS: FORMAL MODELS AND THEIR APPLICATIONS - ICANN 2005, PT 2, PROCEEDINGS, 2005, 3697 : 175 - 180
  • [43] Fast Convergence of Natural Gradient Descent for Overparameterized Neural Networks
    Zhang, Guodong
    Martens, James
    Grosse, Roger
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [44] Gradient Descent Analysis: On Visualizing the Training of Deep Neural Networks
    Becker, Martin
    Lippel, Jens
    Zielke, Thomas
    PROCEEDINGS OF THE 14TH INTERNATIONAL JOINT CONFERENCE ON COMPUTER VISION, IMAGING AND COMPUTER GRAPHICS THEORY AND APPLICATIONS - VOL 3: IVAPP, 2019, : 338 - 345
  • [45] Smooth Exact Gradient Descent Learning in Spiking Neural Networks
    Klos, Christian
    Memmesheimer, Raoul-Martin
    PHYSICAL REVIEW LETTERS, 2025, 134 (02)
  • [46] Gradient Descent Finds Global Minima of Deep Neural Networks
    Du, Simon S.
    Lee, Jason D.
    Li, Haochuan
    Wang, Liwei
    Zhai, Xiyu
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [47] Gradient descent learning of radial-basis neural networks
    Karayiannis, NB
    1997 IEEE INTERNATIONAL CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, 1997, : 1815 - 1820
  • [48] Symmetry-guided gradient descent for quantum neural networks
    Bian, Kaiming
    Zhang, Shitao
    Meng, Fei
    Zhang, Wen
    Dahlsten, Oscar
    PHYSICAL REVIEW A, 2024, 110 (02)
  • [49] Gradient evaluation for neural-networks-based electromagnetic optimization procedures
    Antonini, G
    Orlandi, A
    IEEE TRANSACTIONS ON MICROWAVE THEORY AND TECHNIQUES, 2000, 48 (05) : 874 - 876
  • [50] A GEOMETRIC APPROACH OF GRADIENT DESCENT ALGORITHMS IN LINEAR NEURAL NETWORKS
    Chitour, Yacine
    Liao, Zhenyu
    Couillet, Romain
    MATHEMATICAL CONTROL AND RELATED FIELDS, 2023, 13 (03) : 918 - 945