Online convex optimization using coordinate descent algorithms

Cited by: 0
Authors
Lin, Yankai [1 ]
Shames, Iman [2 ]
Nesic, Dragan [3 ]
Affiliations
[1] Eindhoven Univ Technol, Dept Mech Engn, Eindhoven, Netherlands
[2] Australian Natl Univ, Sch Engn, CIICADA Lab, Acton, ACT 0200, Australia
[3] Univ Melbourne, Dept Elect & Elect Engn, Parkville, Vic 3010, Australia
Funding
Australian Research Council;
Keywords
Online convex optimization; Coordinate descent; Online learning; Regret minimization;
DOI
10.1016/j.automatica.2024.111681
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology];
Discipline Code
0812;
Abstract
This paper considers the problem of online optimization where the objective function is time-varying. In particular, we extend coordinate descent-type algorithms to the online case, where the objective function varies after a finite number of iterations of the algorithm. Instead of solving the problem exactly at each time step, we apply only a finite number of iterations per time step. Commonly used notions of regret are used to measure the performance of the online algorithm. Moreover, coordinate descent algorithms with different updating rules are considered, including both deterministic and stochastic rules developed in the classical offline optimization literature. A thorough regret analysis is given for each case. Finally, numerical simulations are provided to illustrate the theoretical results. (c) 2024 The Author(s). Published by Elsevier Ltd. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
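The abstract describes the scheme only at a high level. As a rough illustration (not the authors' algorithm), the minimal Python sketch below runs a fixed number of single-coordinate gradient updates per time step on a sequence of time-varying convex losses, using either a deterministic cyclic or a uniformly random coordinate rule, and tallies regret against the per-step optimal values. All function names, the step size, and the drifting quadratic losses in the usage example are illustrative assumptions.

```python
import numpy as np


def online_coordinate_descent(x0, losses, step=0.5, inner_iters=3,
                              rule="cyclic", seed=0):
    """Sketch of online coordinate descent on time-varying losses.

    losses : list of (f_t, grad_f_t, f_t_star) triples, where f_t_star is
             the optimal value of f_t (used only to evaluate regret).
    Only `inner_iters` single-coordinate updates are applied at each time
    step before the objective changes.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    n = x.size
    regret = 0.0
    for f, grad_f, f_star in losses:
        for k in range(inner_iters):
            # Coordinate selection: deterministic cyclic sweep or uniform random pick.
            i = k % n if rule == "cyclic" else int(rng.integers(n))
            x[i] -= step * grad_f(x)[i]  # update only coordinate i
        # Regret at time t: loss actually incurred minus the best achievable loss.
        regret += f(x) - f_star
    return x, regret


# Toy usage (illustrative): track a drifting quadratic f_t(x) = 0.5 * ||x - c_t||^2,
# whose optimal value is 0 at every time step.
if __name__ == "__main__":
    T, n = 50, 4
    rng = np.random.default_rng(1)
    centers = np.cumsum(0.1 * rng.standard_normal((T, n)), axis=0)
    losses = [
        (lambda x, c=c: 0.5 * np.sum((x - c) ** 2),
         lambda x, c=c: x - c,
         0.0)
        for c in centers
    ]
    x_final, reg = online_coordinate_descent(np.zeros(n), losses)
    print(f"accumulated regret: {reg:.4f}")
```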
Pages: 13