Online convex optimization using coordinate descent algorithms

Cited by: 0
Authors
Lin, Yankai [1 ]
Shames, Iman [2 ]
Nesic, Dragan [3 ]
Affiliations
[1] Eindhoven Univ Technol, Dept Mech Engn, Eindhoven, Netherlands
[2] Australian Natl Univ, Sch Engn, CIICADA Lab, Acton, ACT 0200, Australia
[3] Univ Melbourne, Dept Elect & Elect Engn, Parkville, Vic 3010, Australia
Funding
Australian Research Council;
Keywords
Online convex optimization; Coordinate descent; Online learning; Regret minimization;
DOI
10.1016/j.automatica.2024.111681
CLC (Chinese Library Classification)
TP [Automation Technology; Computer Technology];
Subject Classification Code
0812;
Abstract
This paper considers the problem of online optimization in which the objective function is time-varying. In particular, we extend coordinate descent type algorithms to the online setting, where the objective function changes after a finite number of iterations of the algorithm. Instead of solving the problem exactly at each time step, we apply only a finite number of coordinate descent iterations before the objective changes. Commonly used notions of regret are adopted to measure the performance of the online algorithm. Moreover, coordinate descent algorithms with different updating rules are considered, including both deterministic and stochastic rules developed in the classical offline optimization literature. A thorough regret analysis is given for each case. Finally, numerical simulations are provided to illustrate the theoretical results. (c) 2024 The Author(s). Published by Elsevier Ltd. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
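To make the setting concrete, the following is a minimal Python sketch (not the authors' algorithm or code) of online coordinate descent with a stochastic, uniformly random coordinate-selection rule: at each time step only K single-coordinate updates are applied before the objective changes, and static regret against the best fixed decision in hindsight is recorded. The quadratic loss family, drift model, step size, and all parameter values are illustrative assumptions.

import numpy as np

# Minimal sketch, assuming time-varying quadratic losses f_t(x) = 0.5 * ||x - c_t||^2
# with slowly drifting minimizers c_t; parameters below are illustrative, not from the paper.
rng = np.random.default_rng(0)
d, T, K, step = 5, 200, 3, 0.5          # dimension, horizon, inner iterations per step, step size
centers = np.cumsum(0.05 * rng.standard_normal((T, d)), axis=0)  # drifting minimizers c_t

def grad(t, x):
    # Gradient of f_t at x.
    return x - centers[t]

x = np.zeros(d)
losses = []
for t in range(T):
    # Only K coordinate updates are performed before the objective changes (online setting).
    for _ in range(K):
        i = rng.integers(d)              # stochastic rule: pick a coordinate uniformly at random
        x[i] -= step * grad(t, x)[i]     # update that single coordinate only
    losses.append(0.5 * np.sum((x - centers[t]) ** 2))

# Static regret: cumulative loss minus that of the best fixed decision in hindsight.
x_star = centers.mean(axis=0)            # minimizer of the sum of the quadratic losses
best_fixed = 0.5 * np.sum((centers - x_star) ** 2, axis=1).sum()
print("static regret:", sum(losses) - best_fixed)

A deterministic rule (e.g., a cyclic sweep over the coordinates) would replace the random index i with a fixed ordering; the paper's regret analysis covers both deterministic and stochastic updating rules.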
Pages: 13
Related Papers
50 records in total
  • [31] Fast Best Subset Selection: Coordinate Descent and Local Combinatorial Optimization Algorithms
    Hazimeh, Hussein
    Mazumder, Rahul
    OPERATIONS RESEARCH, 2020, 68 (05) : 1517 - 1537
  • [32] Using Big Steps in Coordinate Descent Primal-Dual Algorithms
    Bianchi, Pascal
    Fercoq, Olivier
    2016 IEEE 55TH CONFERENCE ON DECISION AND CONTROL (CDC), 2016, : 1895 - 1899
  • [33] A unified approach to statistical tomography using coordinate descent optimization
    Bouman, CA
    Sauer, K
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 1996, 5 (03) : 480 - 492
  • [34] Gradient-free algorithms for distributed online convex optimization
    Liu, Yuhang
    Zhao, Wenxiao
    Dong, Daoyi
    ASIAN JOURNAL OF CONTROL, 2023, 25 (04) : 2451 - 2468
  • [35] ON THE CONVERGENCE OF THE COORDINATE DESCENT METHOD FOR CONVEX DIFFERENTIABLE MINIMIZATION
    LUO, ZQ
    TSENG, P
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 1992, 72 (01) : 7 - 35
  • [36] Convergence Rate Analysis of Randomized and Cyclic Coordinate Descent for Convex Optimization through Semidefinite Programming
    Abbaszadehpeivasti, H.
    De Klerk, E.
    Zamani, M.
    APPLIED SET-VALUED ANALYSIS AND OPTIMIZATION, 2023, 5 (02) : 141 - 153
  • [37] AN INEXACT COORDINATE DESCENT METHOD FOR THE WEIGHTED l1-REGULARIZED CONVEX OPTIMIZATION PROBLEM
    Hua, Xiaoqin
    Yamashita, Nobuo
    PACIFIC JOURNAL OF OPTIMIZATION, 2013, 9 (04) : 567 - 594
  • [38] Solving norm constrained portfolio optimization via coordinate-wise descent algorithms
    Yen, Yu-Min
    Yen, Tso-Jung
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2014, 76 : 737 - 759
  • [39] Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization
    Patrascu, Andrei
    Necoara, Ion
    JOURNAL OF GLOBAL OPTIMIZATION, 2015, 61 (01) : 19 - 46
  • [40] Nested coordinate descent algorithms for empirical likelihood
    Tang, Cheng Yong
    Wu, Tong Tong
    JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION, 2014, 84 (09) : 1917 - 1930