Efficient Learning of Transform-Domain LMS Filter Using Graph Laplacian

Times Cited: 3
Authors
Batabyal, Tamal [1 ]
Weller, Daniel [2 ,3 ]
Kapur, Jaideep [1 ]
Acton, Scott T. [2 ]
Affiliations
[1] Univ Virginia, Dept Neurol, Charlottesville, VA 22904 USA
[2] Univ Virginia, Dept Elect & Comp Engn, Charlottesville, VA 22904 USA
[3] KLA Corp, Ann Arbor, MI 48105 USA
Keywords
Convergence; Autocorrelation; Mathematical models; Transforms; Neurons; Linear systems; Discrete cosine transforms; Graph Laplacian; graph learning; Hebb-least mean squares (LMS) learning; LMS filter; split preconditioner; unitary transform; ADAPTIVE FILTERS; ALGORITHMS;
DOI
10.1109/TNNLS.2022.3144637
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Transform-domain least mean squares (TDLMS) adaptive filters encompass the class of learning algorithms in which the input data are subjected to a data-independent unitary transform followed by a power normalization stage as preprocessing steps. Because conventional transforms are not data-dependent, this preconditioning procedure has been shown theoretically to improve the convergence of the least mean squares (LMS) filter only for certain classes of input data, so the transform must be tailored to the data class. In practice, however, if the class of input data is not known beforehand, it is difficult to decide which transform to use. There is thus a need for a learning framework that obtains such a preconditioning transform from the input data before it is applied. It is hypothesized that the underlying topology of the data affects the selection of the transform. Modeling the input as a weighted finite graph, our method, called preconditioning using graphs (PrecoG), adaptively learns the desired transform by recursive estimation of the graph Laplacian matrix. We show the efficacy of the learned transform as a generalized split preconditioner both on linear systems of equations and in Hebbian-LMS learning models. In terms of the improvement of the condition number after the transform is applied, PrecoG significantly outperforms existing state-of-the-art techniques based on unitary and nonunitary transforms.
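To make the transform-domain preconditioning idea concrete, here is a minimal NumPy sketch, not the paper's PrecoG algorithm: it uses the eigenvectors of a fixed path-graph Laplacian (a DCT-like basis) as the unitary transform, followed by power normalization, and checks the condition number of an AR(1) input autocorrelation matrix before and after. The chain-graph topology, correlation coefficient, and dimension are illustrative assumptions.

```python
import numpy as np

# Assumed illustrative setup (not the paper's recursive Laplacian estimation):
# an AR(1) autocorrelation matrix R with strong correlation is ill-conditioned.
n = 16
rho = 0.95
R = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))

# Laplacian L = D - A of an unweighted path (chain) graph modeling sequential data.
A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
L = np.diag(A.sum(axis=1)) - A

# Orthonormal Laplacian eigenvectors serve as the unitary transform U;
# for a path graph these coincide with DCT-II-like basis vectors.
_, U = np.linalg.eigh(L)

# Transform-domain autocorrelation, then power normalization D^{-1/2}.
R_t = U.T @ R @ U
Dhalf = np.diag(1.0 / np.sqrt(np.diag(R_t)))
R_p = Dhalf @ R_t @ Dhalf  # unit-diagonal preconditioned matrix

cond_before = np.linalg.cond(R)
cond_after = np.linalg.cond(R_p)
print(cond_before, cond_after)  # condition number typically drops sharply
```

A smaller condition number of the (transformed, normalized) input autocorrelation matrix translates directly into faster, more uniform convergence of the LMS weight modes, which is the motivation for learning the transform from the data's graph topology rather than fixing it a priori.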
Pages: 7608 - 7620
Page Count: 13