Evaluation of Gradient Descent Optimization: Using Android Applications in Neural Networks

Cited by: 2
Authors
Alshahrani, Hani [1 ]
Alzahrani, Abdulrahman [1 ]
Alshehri, Ali [1 ]
Alharthi, Raed [1 ]
Fu, Huirong [1 ]
Affiliations
[1] Oakland Univ, Sch Engn & Comp Sci, Rochester, MI 48309 USA
Funding
U.S. National Science Foundation
Keywords
neural networks; gradient descent optimizers; loss function; Android;
DOI
10.1109/CSCI.2017.257
CLC Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Artificial neural networks have gained prominence through applications such as medical diagnosis and malware detection. However, a neural network model's error rate reflects its performance, and optimization algorithms can minimize that error rate by updating the network's parameters toward an optimal solution. This paper uses permissions and underlying Linux system information from the Android platform as features to evaluate gradient descent optimization algorithms in neural networks. The optimizers are evaluated by running them on a set of Android applications to determine which performs best, and each optimizer is assessed with both its default and adjusted parameter values. The evaluation shows that the best accuracy score, 92.21%, is achieved by the Adam optimizer.
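To make the update rules being compared concrete, the sketch below contrasts plain gradient descent with Adam on a logistic-regression classifier over synthetic binary feature vectors. This is a minimal illustration only: the 0/1 vectors merely stand in for the paper's Android permission features, and the dataset, model size, and hyperparameters are assumptions of this sketch, not the authors' experimental setup.

```python
import numpy as np

# Synthetic stand-in for permission-style binary features (illustrative only).
rng = np.random.default_rng(0)
n, d = 1000, 20
X = rng.integers(0, 2, size=(n, d)).astype(float)   # 0/1 feature vectors
true_w = rng.normal(size=d)
y = (X @ true_w + rng.normal(scale=0.5, size=n) > 0).astype(float)

def loss_and_grad(w, X, y):
    """Binary cross-entropy loss and its gradient for logistic regression."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    loss = -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
    grad = X.T @ (p - y) / len(y)
    return loss, grad

def train_gd(lr=0.1, steps=500):
    """Plain gradient descent: w <- w - lr * grad."""
    w = np.zeros(d)
    for _ in range(steps):
        _, g = loss_and_grad(w, X, y)
        w -= lr * g
    return w

def train_adam(lr=0.01, steps=500, b1=0.9, b2=0.999, eps=1e-8):
    """Adam: per-parameter step sizes from bias-corrected moment estimates."""
    w, m, v = np.zeros(d), np.zeros(d), np.zeros(d)
    for t in range(1, steps + 1):
        _, g = loss_and_grad(w, X, y)
        m = b1 * m + (1 - b1) * g          # first-moment (mean) estimate
        v = b2 * v + (1 - b2) * g**2       # second-moment (variance) estimate
        m_hat = m / (1 - b1**t)            # bias correction
        v_hat = v / (1 - b2**t)
        w -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return w

for name, w in [("GD", train_gd()), ("Adam", train_adam())]:
    acc = np.mean(((X @ w) > 0).astype(float) == y)
    print(f"{name}: accuracy = {acc:.4f}")
```

The same comparison pattern extends to the other optimizers the paper evaluates (e.g., momentum, RMSprop) by swapping in their update rules, and to sweeping learning rates and decay parameters as the paper does with default versus adjusted values.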
Pages: 1471-1476 (6 pages)
Related Papers (50 total)
  • [31] Understanding the Convolutional Neural Networks with Gradient Descent and Backpropagation
    Zhou, XueFei
    2nd International Conference on Machine Vision and Information Technology (CMVIT 2018), 2018, 1004
  • [32] Neural Networks can Learn Representations with Gradient Descent
    Damian, Alex
    Lee, Jason D.
    Soltanolkotabi, Mahdi
    Conference on Learning Theory, Vol. 178, 2022
  • [33] Gradient Descent Optimization for Routing in Multistage Interconnection Networks
    Ghaziasgar, Mehran
    Naeini, Armin Tavakoli
    Advances in Computational Intelligence, IWANN 2011, Pt. I, 2011, 6691: 215-222
  • [34] An automatic learning rate decay strategy for stochastic gradient descent optimization methods in neural networks
    Wang, Kang
    Dou, Yong
    Sun, Tao
    Qiao, Peng
    Wen, Dong
    International Journal of Intelligent Systems, 2022, 37(10): 7334-7355
  • [35] Efficient Optimization of Neural Networks for Predictive Hiring: An In-Depth Approach to Stochastic Gradient Descent
    Temsamani, Yassine Khallouk
    Achchab, Said
    2024 5th International Conference on Computing, Networks and Internet of Things (CNIOT 2024), 2024: 588-594
  • [36] Success prediction of android applications in a novel repository using neural networks
    Dehkordi, Mehrdad Razavi
    Seifzadeh, Habib
    Beydoun, Ghassan
    Nadimi-Shahraki, Mohammad H.
    Complex & Intelligent Systems, 2020, 6(3): 573-590
  • [38] Android applications classification with deep neural networks
    Mohammed, Mustapha Adamu
    Asante, Michael
    Alornyo, Seth
    Essah, Bernard Obo
    Iran Journal of Computer Science, 2023, 6(3): 221-232
  • [39] Convergence of Gradient Descent Algorithm for Diagonal Recurrent Neural Networks
    Xu, Dongpo
    Li, Zhengxue
    Wu, Wei
    Ding, Xiaoshuai
    Qu, Di
    2007 Second International Conference on Bio-Inspired Computing: Theories and Applications, 2007: 29-31