Two Recurrent Neural Networks With Reduced Model Complexity for Constrained l1-Norm Optimization

Cited by: 7
Authors
Xia, Youshen [1 ]
Wang, Jun [2 ]
Lu, Zhenyu [3 ]
Huang, Liqing [4 ]
Affiliations
[1] Nanjing Univ Informat Sci & Technol, Coll Artificial Intelligence, Nanjing 211544, Peoples R China
[2] City Univ Hong Kong, Dept Comp Sci, Kowloon, Hong Kong, Peoples R China
[3] Nanjing Univ Informat Sci & Technol, Jiangsu Key Lab Meteorol Observat & Informat Proc, Nanjing 210044, Peoples R China
[4] Fujian Normal Univ, Coll Math & Informat, Fuzhou 350117, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Fast computation; linearly constrained l1-norm optimization; model complexity; recurrent neural network (RNN); l1 estimation problems; equations; systems
DOI
10.1109/TNNLS.2021.3133836
Chinese Library Classification (CLC) number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Because of the robustness and sparsity properties of least absolute deviation (LAD, or l1-norm) optimization, developing effective solution methods for it is an important topic. Recurrent neural networks (RNNs) are reported to be capable of effectively solving constrained l1-norm optimization problems, but their convergence speed is limited. To accelerate convergence, this article introduces two RNNs, in the form of continuous- and discrete-time systems, for solving l1-norm optimization problems with linear equality and inequality constraints. The RNNs are theoretically proven to be globally convergent to optimal solutions without requiring any additional conditions. Owing to their reduced model complexity, the two RNNs can significantly expedite constrained l1-norm optimization. Numerical simulation results show that the two RNNs require much less computational time than related RNNs and numerical optimization algorithms for linearly constrained l1-norm optimization.
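For context, the problem class the abstract refers to can be written as min_x ||Bx - c||_1 subject to Ax = b and Dx <= d, with LAD regression as the special case of minimizing ||Ax - b||_1. The sketch below is an assumption on my part, not the paper's RNN model: it solves an unconstrained LAD problem by plain discrete-time subgradient iterations (the matrix A, vector b, and step size are illustrative), merely to show the kind of iterative dynamics that continuous- and discrete-time neural models embody; the paper's RNNs additionally handle the linear constraints and carry global convergence guarantees.

```python
import numpy as np

# Minimal sketch: solve an unconstrained least-absolute-deviation (LAD)
# problem  min_x ||A x - b||_1  by plain subgradient descent.
# This is NOT the RNN model from the paper; it only illustrates the
# discrete-time iterative dynamics the abstract alludes to.

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 5))                  # design matrix (illustrative)
x_true = rng.standard_normal(5)                    # ground-truth coefficients
b = A @ x_true + 0.01 * rng.standard_normal(100)   # mildly noisy observations

x = np.zeros(5)
step = 1e-3                                        # small constant step size
for _ in range(20000):
    # A subgradient of ||A x - b||_1 is A^T sign(A x - b).
    x -= step * (A.T @ np.sign(A @ x - b))

print("recovered x:", np.round(x, 3))
print("true x     :", np.round(x_true, 3))
```

With a constant step size, such iterations only converge to a neighborhood of the optimum, which illustrates why purpose-built dynamics with proven global convergence, like the two RNNs above, are of interest.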
Pages: 6173 - 6185
Page count: 13
Related papers
50 records in total
  • [31] Asymptotics of the "Minimum L1-Norm" Estimates in a Partly Linear Model
    Shi, Peide
    Li, Guoying
    SYSTEMS SCIENCE AND MATHEMATICAL SCIENCES, 1994, (01) : 67 - 77
  • [32] An L∞/L1-constrained quadratic optimization problem with applications to neural networks
    Leizarowitz, A
    Rubinstein, J
    APPLIED MATHEMATICS AND OPTIMIZATION, 2004, 49 (01): : 55 - 80
  • [33] Grassmann Manifold Optimization for Fast L1-Norm Principal Component Analysis
    Minnehan, Breton
    Savakis, Andreas
    IEEE SIGNAL PROCESSING LETTERS, 2019, 26 (02) : 242 - 246
  • [34] BALSON: Bayesian Least Squares Optimization with Nonnegative L1-Norm Constraint
    Xie, Jiyang
    Ma, Zhanyu
    Zhang, Guoqiang
    Xue, Jing-Hao
    Chien, Jen-Tzung
    Lin, Zhiqing
    Guo, Jun
    2018 IEEE 28TH INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP), 2018,
  • [35] Approximation and optimization of L1-norm for continuous-time linear systems
    Wang, Xiajiao
    Wu, Jun
    Xu, Weihua
    Chen, Sheng
    Wang, Xiaoliang
    WCICA 2006: SIXTH WORLD CONGRESS ON INTELLIGENT CONTROL AND AUTOMATION, VOLS 1-12, CONFERENCE PROCEEDINGS, 2006, : 2224 - 2228
  • [36] A Compact Cooperative Recurrent Neural Network for Computing General Constrained L1 Norm Estimators
    Xia, Youshen
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2009, 57 (09) : 3693 - 3697
  • [37] Model predictive control of electric drive system with L1-norm
    Feher, Marek
    Straka, Ondrej
    Smidl, Vaclav
    EUROPEAN JOURNAL OF CONTROL, 2020, 56 : 242 - 253
  • [38] Fast sparse representation model for l1-norm minimisation problem
    Peng, C. Y.
    Li, J. W.
    ELECTRONICS LETTERS, 2012, 48 (03) : 154 - U42
  • [39] Comparison of l∞-norm and l1-norm optimization criteria for SIR-balanced multi-user beamforming
    Schubert, M
    Boche, H
    SIGNAL PROCESSING, 2004, 84 (02) : 367 - 378
  • [40] Robust L1-norm two-dimensional linear discriminant analysis
    Li, Chun-Na
    Shao, Yuan-Hai
    Deng, Nai-Yang
    NEURAL NETWORKS, 2015, 65 : 92 - 104