Conjugate Gradient and Variance Reduction Based Online ADMM for Low-Rank Distributed Networks
Cited by: 0
Authors:
Chen, Yitong [1]
Jin, Danqi [2]
Chen, Jie [1]
Richard, Cedric [3]
Zhang, Wen [1]
Affiliations:
[1] Northwestern Polytech Univ, Ctr Intelligent Acoust & Immers Commun, Sch Marine Sci & Technol, Xian 710071, Peoples R China
[2] Wuhan Univ, Sch Elect Informat, Wuhan 430072, Peoples R China
[3] Univ Cote Dazur, CNRS, OCA, F-06000 Nice, France
Funding:
National Natural Science Foundation of China;
Keywords:
ADMM;
conjugate gradient descent;
distributed optimization;
low-rank;
variance reduction;
DIFFUSION ADAPTATION;
STRATEGIES;
APPROXIMATION;
COMBINATION;
ALGORITHM;
SPARSE;
DOI:
10.1109/LSP.2025.3531200
CLC classification:
TM [Electrical engineering];
TN [Electronic technology and communication technology];
Discipline codes:
0808;
0809;
Abstract:
Modeling the relationships that may connect optimal parameter vectors is essential for the performance of parameter estimation methods in distributed networks. In this paper, we consider a low-rank relationship and introduce matrix factorization to promote this low-rank property. To devise a distributed algorithm that does not require any prior knowledge about the low-rank space, we first formulate local optimization problems at each node, which are subsequently addressed using the Alternating Direction Method of Multipliers (ADMM). Three subproblems naturally arise from ADMM, each resolved in an online manner with low computational costs. Specifically, the first one is solved using stochastic gradient descent (SGD), while the other two are handled using the conjugate gradient descent method to avoid matrix inversion operations. To further enhance performance, a variance reduction algorithm is incorporated into the SGD. Simulation results validate the effectiveness of the proposed algorithm.
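The abstract notes that two of the three ADMM subproblems are solved with the conjugate gradient method specifically to avoid matrix inversion. The sketch below is not the paper's algorithm; it is a generic, minimal conjugate gradient solver for a symmetric positive-definite system Ax = b, included only to illustrate how a linear subproblem can be solved with matrix-vector products alone, with no explicit inverse.

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    """Solve A x = b for symmetric positive-definite A using only
    matrix-vector products (no matrix inversion)."""
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    r = b - A @ x            # initial residual
    p = r.copy()             # initial search direction
    rs_old = r @ r
    for _ in range(max_iter or n):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)       # exact step size along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p   # next A-conjugate direction
        rs_old = rs_new
    return x

# Small SPD example (hypothetical values, not from the paper)
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

In exact arithmetic, CG converges in at most n iterations for an n-by-n SPD system, which is why it is a common inversion-free choice inside online ADMM updates where each subproblem is a regularized least-squares system.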
Pages: 706 / 710
Page count: 5