Adaptive Stochastic Gradient Descent (SGD) for erratic datasets

Cited by: 2
Authors
Dagal, Idriss [1 ]
Tanrioven, Kursat [1 ]
Nayir, Ahmet [1 ]
Akin, Burak [2 ]
Affiliations
[1] Istanbul Beykent Univ, Elect Engn, Hadim Koruyolu Caddesi 19, TR-34450 Istanbul, Turkiye
[2] Yildiz Tech Univ, Elect Engn, Davutpasa Caddesi, TR-34220 Istanbul, Turkiye
Keywords
Gradient descent; Stochastic Gradient Descent; Accuracy; Principal Component Analysis; Quasi-Newton method; Neural networks; Algorithm; MLP
DOI
10.1016/j.future.2024.107682
CLC Classification Number
TP301 [Theory and Methods]
Discipline Classification Code
081202
Abstract
Stochastic Gradient Descent (SGD) is a highly efficient optimization algorithm, particularly well suited to large datasets because of its incremental parameter updates. In this study, we apply SGD to a simple linear classifier based on logistic regression, a widely used method for binary classification tasks. Unlike traditional batch Gradient Descent (GD), which processes the entire dataset at once, SGD offers better scalability and performance for streaming and large-scale data. Our experiments show that SGD outperforms GD across multiple performance metrics, achieving 45.83% accuracy compared with GD's 41.67%, and leading in precision (60% vs. 45.45%), recall (100% vs. 60%), and F1-score (100% vs. 62%). Additionally, SGD achieves 99.99% Principal Component Analysis (PCA) accuracy, slightly surpassing GD's 99.92%. These results highlight SGD's superior efficiency and flexibility in large-scale data environments, driven by its ability to balance precision and recall effectively. To further enhance SGD's robustness, the proposed method incorporates adaptive learning rates, momentum, and logistic regression, addressing the drawbacks of traditional GD. These modifications improve the algorithm's stability, convergence behavior, and applicability to complex, large-scale optimization tasks where standard GD often struggles, making SGD a highly effective solution for challenging data-driven scenarios.
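The abstract combines per-sample SGD updates with classical momentum and an adaptive (decaying) learning rate for a logistic-regression classifier. Below is a minimal NumPy sketch of that recipe; the 1/(1 + decay·t) schedule, the momentum coefficient, and all function and parameter names are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def sgd_logistic(X, y, epochs=50, lr0=0.1, decay=0.01, momentum=0.9, seed=0):
    """Logistic regression trained with per-sample SGD, classical momentum,
    and a decaying (adaptive) learning rate. Hypothetical sketch, not the
    paper's code. X: (n_samples, n_features); y: 0/1 labels of shape (n_samples,)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    vw, vb = np.zeros(d), 0.0            # momentum (velocity) buffers
    step = 0
    for _ in range(epochs):
        for i in rng.permutation(n):     # incremental: one sample per update
            lr = lr0 / (1.0 + decay * step)   # assumed decaying learning-rate schedule
            z = X[i] @ w + b
            p = 1.0 / (1.0 + np.exp(-z))      # sigmoid prediction
            g = p - y[i]                      # gradient of the log-loss w.r.t. z
            vw = momentum * vw + lr * g * X[i]
            vb = momentum * vb + lr * g
            w -= vw
            b -= vb
            step += 1
    return w, b

# Usage example on toy two-cluster data
X = np.vstack([np.random.randn(50, 2) + 2, np.random.randn(50, 2) - 2])
y = np.array([1] * 50 + [0] * 50)
w, b = sgd_logistic(X, y)
acc = np.mean(((X @ w + b) > 0).astype(int) == y)
print(f"training accuracy: {acc:.2%}")
```

The decaying schedule and momentum buffers are the two additions the abstract credits with stabilizing convergence relative to plain GD; any other adaptive rule (e.g., per-parameter rates) would only change the learning-rate line in this sketch.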
Pages: 13