Local Dimensionality Reduction for Non-Parametric Regression

Cited by: 0
Authors
Heiko Hoffmann
Stefan Schaal
Sethu Vijayakumar
Affiliations
[1] University of Edinburgh, IPAB, School of Informatics
[2] University of Southern California, Biomedical Engineering
[3] University of Southern California, Computer Science and Neuroscience
Source
Neural Processing Letters | 2009, Vol. 29
Keywords
Correlation; Dimensionality reduction; Factor analysis; Incremental learning; Kernel function; Locally-weighted regression; Partial least squares; Principal component analysis; Principal component regression; Reduced-rank regression
DOI
Not available
Abstract
Locally-weighted regression is a computationally efficient technique for non-linear regression. However, for high-dimensional data, this technique becomes numerically brittle and computationally too expensive if many local models need to be maintained simultaneously. Thus, local linear dimensionality reduction combined with locally-weighted regression seems to be a promising solution. In this context, we review linear dimensionality-reduction methods, compare their performance on non-parametric locally-linear regression, and discuss their ability to extend to incremental learning. The considered methods belong to the following three groups: (1) reducing dimensionality only on the input data, (2) modeling the joint input-output data distribution, and (3) optimizing the correlation between projection directions and output data. Group 1 contains principal component regression (PCR); group 2 contains principal component analysis (PCA) in joint input and output space, factor analysis, and probabilistic PCA; and group 3 contains reduced-rank regression (RRR) and partial least squares (PLS) regression. Among the tested methods, only group 3 achieved robust performance even for a non-optimal number of components (factors or projection directions). In contrast, groups 1 and 2 failed when given too few components, since these methods rely on a correct estimate of the true intrinsic dimensionality. In group 3, PLS is the only method for which a computationally efficient incremental implementation exists. Thus, PLS appears to be ideally suited as a building block for a locally-weighted regressor in which projection directions are incrementally added on the fly.
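Since the abstract singles out PLS as the only group-3 method with an efficient incremental implementation, a minimal sketch of single-output PLS regression (a NIPALS-style PLS1, written here in Python with NumPy) may help make concrete how projection directions are extracted one at a time. The function names and the toy data below are illustrative assumptions, not taken from the paper.

import numpy as np

def pls1_fit(X, y, n_components):
    # Minimal PLS1 sketch (illustrative, not the paper's implementation):
    # extract one projection direction at a time from centered data.
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc                 # direction of maximal input-output covariance
        w /= np.linalg.norm(w)
        t = Xc @ w                    # latent scores along this direction
        tt = t @ t
        p = Xc.T @ t / tt             # input loadings
        b = (yc @ t) / tt             # regression weight on this score
        Xc = Xc - np.outer(t, p)      # deflate inputs
        yc = yc - b * t               # deflate output
        W.append(w); P.append(p); q.append(b)
    return np.array(W), np.array(P), np.array(q), x_mean, y_mean

def pls1_predict(X, W, P, q, x_mean, y_mean):
    # Project onto each stored direction, deflate, and accumulate the prediction.
    Xc = X - x_mean
    y_hat = np.full(len(X), y_mean, dtype=float)
    for w, p, b in zip(W, P, q):
        t = Xc @ w
        y_hat += b * t
        Xc = Xc - np.outer(t, p)
    return y_hat

# Toy usage: a 10-D input whose output depends on only two directions.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.normal(size=200)
W, P, q, xm, ym = pls1_fit(X, y, n_components=2)
print("mean abs error:", np.abs(pls1_predict(X, W, P, q, xm, ym) - y).mean())

In a locally-weighted regressor of the kind discussed in the abstract, each local model would fit such a PLS expansion on distance-weighted data and add projection directions incrementally as needed.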