Mini-batch descent in semiflows

Cited by: 0
Authors
Corella, Alberto Dominguez
Hernandez, Martin [1 ]
Affiliation
[1] Friedrich Alexander Univ Erlangen Nurnberg, Dynam Control Machine Learning & Numer, D-91058 Erlangen, Germany
Keywords
Gradient flow; mini-batch; stochastic gradient descent; domain decomposition; obstacle; optimization
DOI
10.1051/cocv/2025018
CLC classification
TP [Automation and Computer Technology]
Discipline code
0812
Abstract
This paper investigates the application of mini-batch gradient descent to semiflows (gradient flows). Given a loss function (potential), we introduce a continuous version of mini-batch gradient descent by randomly selecting sub-loss functions over time, defining a piecewise flow. We prove that, under suitable assumptions on the potential generating the semiflow, the mini-batch descent flow trajectory closely approximates the original semiflow trajectory on average. In addition, we study a randomized minimizing movement scheme that also approximates the semiflow of the full loss function. We illustrate the versatility of this approach across various problems, including constrained optimization, sparse inversion, and domain decomposition. Finally, we validate our results with several numerical examples.
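The following is a minimal sketch (not the authors' code) of the two schemes described in the abstract: a piecewise flow that switches randomly between sub-loss functions, and a randomized minimizing movement (proximal) scheme. It assumes a finite-sum potential F(x) = (1/N) Σ_i f_i(x) with quadratic sub-losses, explicit Euler integration of the piecewise flow, and a closed-form proximal step; all data and parameter names (batch_size, switch_every, tau) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Finite-sum potential F(x) = (1/N) * sum_i f_i(x), illustrated with
# quadratic sub-losses f_i(x) = 0.5 * ||A_i x - b_i||^2 (synthetic data).
N, d = 8, 5
A = rng.standard_normal((N, d, d))
b = rng.standard_normal((N, d))

def grad_f(i, x):
    """Gradient of the i-th sub-loss f_i."""
    return A[i].T @ (A[i] @ x - b[i])

def grad_F(x):
    """Gradient of the full potential F."""
    return np.mean([grad_f(i, x) for i in range(N)], axis=0)

def minibatch_flow(x0, T=5.0, h=1e-3, batch_size=2, switch_every=0.1):
    """Piecewise flow: on each interval of length `switch_every`, draw a random
    mini-batch B_k and integrate x' = -grad of the averaged sub-losses over B_k,
    here by explicit Euler with step h."""
    x, t = x0.copy(), 0.0
    while t < T:
        batch = rng.choice(N, size=batch_size, replace=False)
        t_next = min(t + switch_every, T)
        while t < t_next:
            g = np.mean([grad_f(i, x) for i in batch], axis=0)
            x -= h * g
            t += h
    return x

def randomized_minimizing_movement(x0, tau=0.05, n_steps=200):
    """Randomized minimizing movement: at each step pick a random sub-loss f_i
    and take the proximal step x_{k+1} = argmin_y f_i(y) + ||y - x_k||^2 / (2*tau).
    For quadratic f_i this argmin has a closed form."""
    x = x0.copy()
    for _ in range(n_steps):
        i = rng.integers(N)
        # Solve (A_i^T A_i + I/tau) y = A_i^T b_i + x/tau
        H = A[i].T @ A[i] + np.eye(d) / tau
        rhs = A[i].T @ b[i] + x / tau
        x = np.linalg.solve(H, rhs)
    return x

x0 = np.zeros(d)
print("||grad F|| after mini-batch flow:", np.linalg.norm(grad_F(minibatch_flow(x0))))
print("||grad F|| after randomized MM:  ", np.linalg.norm(grad_F(randomized_minimizing_movement(x0))))
```

Under the paper's assumptions, the piecewise trajectory stays close, on average, to the trajectory of the gradient flow of the full potential F; the sketch only illustrates the mechanics of the two schemes, not the quantitative estimates of the paper.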
Pages: 31
Related papers
Items [31]-[40] of 50
  • [31] Mini-batch stochastic subgradient for functional constrained optimization
    Singh, Nitesh Kumar
    Necoara, Ion
    Kungurtsev, Vyacheslav
    OPTIMIZATION, 2024, 73 (07) : 2159 - 2185
  • [32] Breast cancer detection using Histopathology Image with Mini-Batch Stochastic Gradient Descent and Convolutional Neural Network
    Sasirekha, N.
    Karuppaiah, Jayakumar
    Shekhar, Himanshu
    Saranya, N. Naga
    JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2023, 45 (03) : 4651 - 4667
  • [33] Gaussian Process Parameter Estimation Using Mini-batch Stochastic Gradient Descent: Convergence Guarantees and Empirical Benefits
    Chen, Hao
    Zheng, Lili
    Al Kontar, Raed
    Raskutti, Garvesh
    JOURNAL OF MACHINE LEARNING RESEARCH, 2022, 23
  • [34] Accelerating mini-batch SARAH by step size rules
    Yang, Zhuang
    Chen, Zengping
    Wang, Cheng
    INFORMATION SCIENCES, 2021, 558 : 157 - 173
  • [35] Stronger Adversarial Attack: Using Mini-batch Gradient
    Yu, Lin
    Deng, Ting
    Zhang, Wenxiang
    Zeng, Zhigang
    2020 12TH INTERNATIONAL CONFERENCE ON ADVANCED COMPUTATIONAL INTELLIGENCE (ICACI), 2020, : 364 - 370
  • [36] Research on Mini-Batch Affinity Propagation Clustering Algorithm
    Xu, Ziqi
    Lu, Yahui
    Jiang, Yu
    2022 IEEE 9TH INTERNATIONAL CONFERENCE ON DATA SCIENCE AND ADVANCED ANALYTICS (DSAA), 2022, : 86 - 95
  • [37] An Asynchronous Mini-batch Algorithm for Regularized Stochastic Optimization
    Feyzmahdavian, Hamid Reza
    Aytekin, Arda
    Johansson, Mikael
    2015 54TH IEEE CONFERENCE ON DECISION AND CONTROL (CDC), 2015, : 1384 - 1389
  • [38] Fixing Mini-batch Sequences with Hierarchical Robust Partitioning
    Wang, Shengjie
    Bai, Wenruo
    Lavania, Chandrashekhar
    Bilmes, Jeffrey A.
    22ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 89, 2019, 89
  • [39] Gradient preconditioned mini-batch SGD for ridge regression
    Zhang, Zhuan
    Zhou, Shuisheng
    Li, Dong
    Yang, Ting
    NEUROCOMPUTING, 2020, 413 : 284 - 293
  • [40] An Asynchronous Mini-Batch Algorithm for Regularized Stochastic Optimization
    Feyzmahdavian, Hamid Reza
    Aytekin, Arda
    Johansson, Mikael
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2016, 61 (12) : 3740 - 3754