On the statistical efficiency of the LMS family of adaptive algorithms

Cited by: 0
Authors: Widrow, B [1]; Kamenetsky, M [1]
Affiliation: [1] Stanford Univ, Dept Elect Engn, ISL, Stanford, CA 94305 USA
Keywords: (none listed)
DOI: (not available)
CLC classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Two gradient descent adaptive algorithms are compared, the LMS algorithm and the LMS/Newton algorithm. LMS is simple and practical, and is used in many applications worldwide. LMS/Newton is based on Newton's method and the LMS algorithm. LMS/Newton is optimal in the least squares sense. It maximizes the quality of its adaptive solution while minimizing the use of training data. No other linear least squares algorithm can give better performance. LMS is easily implemented, but LMS/Newton, although of great mathematical interest, cannot be implemented in most practical applications. Because of its optimality, LMS/Newton serves as a benchmark for all least squares adaptive algorithms. The performances of LMS and LMS/Newton are compared, and it is found that under many circumstances, both algorithms provide equal performance. For example, when both algorithms are tested with statistically nonstationary input signals, their average performances are equal. When adapting with stationary input signals and with random initial conditions, their respective learning times are on average equal. However, under worst-case initial conditions, the learning time of LMS can be much greater than that of LMS/Newton, and this is the principal disadvantage of the LMS algorithm. But the strong points of LMS are ease of implementation and optimal performance under important practical conditions. For these reasons, the LMS algorithm has enjoyed very widespread application.
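A minimal sketch of the two update rules compared in the abstract, in their standard forms: LMS uses w <- w + 2*mu*e*x, while LMS/Newton premultiplies the same gradient estimate by the inverse input autocorrelation matrix, w <- w + 2*mu*R_inv*e*x. The toy system-identification setup, step size, and all variable names below are illustrative assumptions, not taken from the paper; R is assumed known for LMS/Newton, which is exactly why it serves as a benchmark rather than a practical algorithm.

import numpy as np

def lms(x, d, n_taps, mu):
    """Plain LMS adaptive filter; returns final weights and error history."""
    w = np.zeros(n_taps)
    err = np.zeros(len(x))
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]   # tap-delay-line input [x[n], ..., x[n-n_taps+1]]
        e = d[n] - w @ u                    # instantaneous error
        w = w + 2 * mu * e * u              # stochastic gradient step
        err[n] = e
    return w, err

def lms_newton(x, d, n_taps, mu, R_inv):
    """LMS/Newton: the same gradient estimate, premultiplied by R^{-1} (R assumed known)."""
    w = np.zeros(n_taps)
    err = np.zeros(len(x))
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]
        e = d[n] - w @ u
        w = w + 2 * mu * (R_inv @ u) * e    # whitened (Newton) step
        err[n] = e
    return w, err

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_taps, mu, N = 8, 0.01, 20000
    # Colored (correlated) input: the case where the two algorithms differ most.
    x = np.convolve(rng.standard_normal(N), [1.0, 0.9, 0.5], mode="same")
    h_true = rng.standard_normal(n_taps)                     # unknown system to identify
    d = np.convolve(x, h_true, mode="full")[:N] + 0.01 * rng.standard_normal(N)

    # Estimate the input autocorrelation matrix R from data and invert it for LMS/Newton.
    U = np.array([x[n - n_taps + 1:n + 1][::-1] for n in range(n_taps - 1, N)])
    R_inv = np.linalg.inv(U.T @ U / len(U))

    w1, e1 = lms(x, d, n_taps, mu)
    w2, e2 = lms_newton(x, d, n_taps, mu, R_inv)
    print("LMS        final MSE:", np.mean(e1[-1000:] ** 2))
    print("LMS/Newton final MSE:", np.mean(e2[-1000:] ** 2))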
Pages: 2872-2880
Number of pages: 9