On Convergence of the Iteratively Preconditioned Gradient-Descent (IPG) Observer

Cited: 0
Authors
Chakrabarti, Kushal [1 ]
Chopra, Nikhil [2 ]
Affiliations
[1] Tata Consultancy Serv Res, Div Data & Decis Sci, Mumbai 400607, India
[2] Univ Maryland Coll Pk, Dept Mech Engn, College Pk, MD 20742 USA
Source
IEEE CONTROL SYSTEMS LETTERS, 2024
Keywords
Estimation; observers for nonlinear systems; optimization algorithms; TIME; SYSTEMS; DESIGN
DOI
10.1109/LCSYS.2024.3416337
CLC Classification Number
TP [Automation technology; computer technology]
Discipline Classification Code
0812
Abstract
This letter considers the observer design problem for discrete-time nonlinear dynamical systems with sampled measurements. The recently proposed Iteratively Preconditioned Gradient-Descent (IPG) observer, a Newton-type observer, has been empirically shown to be more robust to measurement noise than prominent nonlinear observers, a property that other Newton-type observers lack. However, no theoretical guarantees on the convergence of the IPG observer were previously available. This letter presents a rigorous convergence analysis of the IPG observer for a class of nonlinear systems in a deterministic setting, proving its local linear convergence to the actual trajectory. The assumptions are standard in the existing literature on Newton-type observers, and the analysis further confirms the relation of the IPG observer to the Newton observer, which was previously only hypothesized.
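The letter itself contains no code; the following minimal sketch illustrates the iteratively preconditioned gradient-descent idea behind the observer, assuming the update form used in the authors' earlier IPG work (the letter's exact recursion may differ): the state estimate moves along a preconditioned gradient of an output-mismatch cost over a measurement window, while the preconditioner K is itself refined by a gradient step that drives it toward the inverse Hessian. The toy system, step sizes, and all function names below are illustrative, not taken from the letter.

```python
import numpy as np

# Illustrative discrete-time nonlinear system (not from the letter):
#   x[k+1] = F(x[k]),   y[k] = h(x[k])   (scalar sampled output).
def F(x):
    return np.array([x[0] + 0.1 * x[1], x[1] - 0.1 * np.sin(x[0])])

def h(x):
    return np.array([x[0]])

def simulate_outputs(x0, N):
    """Collect sampled outputs y[0..N-1] starting from the true state x0."""
    x, ys = x0.copy(), []
    for _ in range(N):
        ys.append(h(x))
        x = F(x)
    return ys

def cost(x0_hat, ys):
    """Least-squares output mismatch over the window, viewed as a function
    of the estimated initial state x0_hat."""
    x, c = x0_hat.copy(), 0.0
    for y in ys:
        r = h(x) - y
        c += 0.5 * float(r @ r)
        x = F(x)
    return c

def num_grad(f, x, eps=1e-6):
    """Central-difference gradient (kept numeric to stay self-contained)."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2.0 * eps)
    return g

def num_hess(f, x, eps=1e-4):
    """Central-difference Hessian, symmetrized."""
    n = x.size
    H = np.zeros((n, n))
    for i in range(n):
        e = np.zeros(n)
        e[i] = eps
        H[:, i] = (num_grad(f, x + e) - num_grad(f, x - e)) / (2.0 * eps)
    return 0.5 * (H + H.T)

def ipg_estimate(ys, x0_guess, iters=200, delta=0.5):
    """Assumed IPG form:  K <- K - alpha*(H(x) K - I),  x <- x - delta*K*grad,
    so the preconditioner K is driven toward inv(H) while x descends."""
    f = lambda z: cost(z, ys)
    x = x0_guess.copy()
    K = np.eye(x.size)                                # preconditioner iterate
    for _ in range(iters):
        g, H = num_grad(f, x), num_hess(f, x)
        alpha = 1.0 / (np.linalg.norm(H, 2) + 1e-9)   # keeps the K-update stable
        K = K - alpha * (H @ K - np.eye(x.size))      # preconditioner GD step
        x = x - delta * (K @ g)                       # preconditioned state step
    return x

# Local convergence only (matching the letter's guarantee): the initial
# estimate is taken in a neighborhood of the true initial state.
x_true = np.array([0.8, -0.5])
ys = simulate_outputs(x_true, N=20)
x_hat = ipg_estimate(ys, x0_guess=np.array([0.6, -0.3]))
print("true x0:", x_true, "  estimated x0:", x_hat)
```

As K approaches the inverse Hessian, the preconditioned step approaches a damped Newton step, consistent with the relation to the Newton observer that the letter makes rigorous.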
Pages: 1715-1720
Page count: 6
Related Papers
50 records in total; entries [41]-[50] shown
  • [41] Modeling and analysis of dielectric materials by using gradient-descent optimization method
    Alagoz, B.B.
    Alisoy, H.Z.
    Koseoglu, M.
    Alagoz, S.
    World Scientific (08)
  • [42] Fault-Tolerant Probabilistic Gradient-Descent Bit Flipping Decoder
    Al Rasheed, Omran
    Ivanis, Predrag
    Vasic, Bane
    IEEE COMMUNICATIONS LETTERS, 2014, 18 (09) : 1487 - 1490
  • [43] Phase-only pattern synthesis based on gradient-descent optimization
    Lu, Chengjun
    Sheng, Weixing
    Han, Yubing
    Ma, Xiaofeng
    JOURNAL OF SYSTEMS ENGINEERING AND ELECTRONICS, 2016, 27 (02) : 297 - 307
  • [44] Understanding the Unstable Convergence of Gradient Descent
    Ahn, Kwangjun
    Zhang, Jingzhao
    Sra, Suvrit
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022, : 247 - 257
  • [45] Convergence of Stochastic Gradient Descent for PCA
    Shamir, Ohad
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 48, 2016
  • [46] On the Unstable Convergence Regime of Gradient Descent
    Chen, Shuo
    Peng, Jiaying
    Li, Xiaolong
    Zhao, Yao
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 10, 2024, : 11373 - 11380
  • [47] Convergence of Gradient Descent on Separable Data
    Nacson, Mor Shpigel
    Lee, Jason D.
    Gunasekar, Suriya
    Savarese, Pedro H. P.
    Srebro, Nathan
    Soudry, Daniel
    22ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 89, 2019
  • [48] A time-series modeling method based on the boosting gradient-descent theory
    Gao YunLong
    Pan JinYan
    Ji GuoLi
    Gao Feng
    SCIENCE CHINA-TECHNOLOGICAL SCIENCES, 2011, 54 (05) : 1325 - 1337
  • [49] Learned Preconditioned Conjugate Gradient Descent for Massive MIMO Detection
    Olutayo, Toluwaleke
    Champagne, Benoit
    2022 IEEE LATIN-AMERICAN CONFERENCE ON COMMUNICATIONS (LATINCOM), 2022
  • [50] Stochastic Gradient Descent with Preconditioned Polyak Step-Size
    Abdukhakimov, F.
    Xiang, C.
    Kamzolov, D.
    Takac, M.
    COMPUTATIONAL MATHEMATICS AND MATHEMATICAL PHYSICS, 2024, 64 (04) : 621 - 634