Evaluation of Gradient Descent Optimization: Using Android Applications in Neural Networks

Cited by: 2
Authors
Alshahrani, Hani [1 ]
Alzahrani, Abdulrahman [1 ]
Alshehri, Ali [1 ]
Alharthi, Raed [1 ]
Fu, Huirong [1 ]
Affiliations
[1] Oakland Univ, Sch Engn & Comp Sci, Rochester, MI 48309 USA
Funding
U.S. National Science Foundation
Keywords
neural networks; gradient descent optimizers; loss function; Android
DOI
10.1109/CSCI.2017.257
CLC number
TP18 [Theory of Artificial Intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Artificial neural networks have gained prominence in applications such as medical diagnosis and malware detection. A neural network model's error rate reflects its performance, and optimization algorithms can reduce that error by updating the network's parameters toward an optimal solution. This paper uses permissions and underlying Linux system information features from the Android platform to evaluate gradient descent optimization algorithms in neural networks. The optimizers are evaluated by running them on a set of Android applications to identify the best-performing one, and each optimizer is assessed with both its default and adjusted parameter values. The evaluation shows that the best accuracy score, 92.21%, is achieved by the Adam optimizer.
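To make the optimizer comparison concrete, the sketch below trains a single-layer classifier on synthetic binary "permission" features, once with plain gradient descent and once with Adam. This is a minimal illustration, not the authors' code: the dataset, feature layout, and all hyperparameter values are assumptions chosen for readability.

```python
# Minimal sketch (not the paper's implementation): full-batch gradient
# descent vs. Adam on a toy binary classifier over synthetic 0/1
# "permission" features. All data and hyperparameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic dataset: 200 "apps", each with 10 binary permission-style features.
X = rng.integers(0, 2, size=(200, 10)).astype(float)
true_w = rng.normal(size=10)
y = (X @ true_w > 0).astype(float)  # synthetic benign/malicious labels

def loss_and_grad(w):
    """Mean logistic loss and its gradient with respect to the weights."""
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    loss = -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
    grad = X.T @ (p - y) / len(y)
    return loss, grad

def train_gd(lr=0.1, steps=500):
    """Vanilla gradient descent: a fixed step against the raw gradient."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        _, g = loss_and_grad(w)
        w -= lr * g
    return w

def train_adam(lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8, steps=500):
    """Adam: per-parameter steps scaled by bias-corrected moment estimates."""
    w = np.zeros(X.shape[1])
    m = np.zeros_like(w)  # first-moment (mean) estimate of the gradient
    v = np.zeros_like(w)  # second-moment (uncentered variance) estimate
    for t in range(1, steps + 1):
        _, g = loss_and_grad(w)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g**2
        m_hat = m / (1 - beta1**t)  # bias correction for zero-initialized m
        v_hat = v / (1 - beta2**t)  # bias correction for zero-initialized v
        w -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return w

for name, w in [("gradient descent", train_gd()), ("Adam", train_adam())]:
    loss, _ = loss_and_grad(w)
    acc = np.mean(((X @ w) > 0) == y)
    print(f"{name}: loss={loss:.4f}, accuracy={acc:.2%}")
```

Swapping the default learning rates above for tuned ones mirrors the paper's comparison of default versus adjusted parameter values; Adam's adaptive per-parameter scaling is one common reason it can outperform a single fixed step size on sparse binary features such as permissions.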
Pages: 1471-1476
Page count: 6
Related papers
50 in total
  • [21] Fast batch gradient descent in quantum neural networks
    Shim, Joo Yong
    Kim, Joongheon
    ELECTRONICS LETTERS, 2025, 61 (01)
  • [22] A Convergence Analysis of Gradient Descent on Graph Neural Networks
    Awasthi, Pranjal
    Das, Abhimanyu
    Gollapudi, Sreenivas
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [23] Learning Graph Neural Networks with Approximate Gradient Descent
    Li, Qunwei
    Zou, Shaofeng
    Zhong, Wenliang
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 8438 - 8446
  • [24] Calibrated Stochastic Gradient Descent for Convolutional Neural Networks
    Zhuo, Li'an
    Zhang, Baochang
    Chen, Chen
    Ye, Qixiang
    Liu, Jianzhuang
    Doermann, David
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 9348 - 9355
  • [25] Gradient descent learning for quaternionic Hopfield neural networks
    Kobayashi, Masaki
    NEUROCOMPUTING, 2017, 260 : 174 - 179
  • [26] Analysis of natural gradient descent for multilayer neural networks
    Rattray, M
    Saad, D
    PHYSICAL REVIEW E, 1999, 59 (04) : 4523 - 4532
  • [27] A gradient descent learning algorithm for fuzzy neural networks
    Feuring, T
    Buckley, JJ
    Hayashi, Y
    1998 IEEE INTERNATIONAL CONFERENCE ON FUZZY SYSTEMS AT THE IEEE WORLD CONGRESS ON COMPUTATIONAL INTELLIGENCE - PROCEEDINGS, VOL 1-2, 1998, : 1136 - 1141
  • [28] Generalization Guarantees of Gradient Descent for Shallow Neural Networks
    Wang, Puyu
    Lei, Yunwen
    Wang, Di
    Ying, Yiming
    Zhou, Ding-Xuan
    NEURAL COMPUTATION, 2025, 37 (02) : 344 - 402
  • [29] Convergence of gradient descent for learning linear neural networks
    Nguegnang, Gabin Maxime
    Rauhut, Holger
    Terstiege, Ulrich
    ADVANCES IN CONTINUOUS AND DISCRETE MODELS, 2024, 2024 (01)
  • [30] Fractional Gradient Descent Method for Spiking Neural Networks
    Yang, Honggang
    Chen, Jiejie
    Jiang, Ping
    Xu, Mengfei
    Zhao, Haiming
    2023 2ND CONFERENCE ON FULLY ACTUATED SYSTEM THEORY AND APPLICATIONS, CFASTA, 2023, : 636 - 641