Dimensionality reduction using covariance operator inverse regression

Cited by: 0
Authors
Kim, Minyoung [1 ]
Pavlovic, Vladimir [1 ]
Affiliations
[1] Rutgers State Univ, Dept Comp Sci, Piscataway, NJ 08854 USA
Keywords
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405
Abstract
We consider the task of dimensionality reduction for regression (DRR), whose goal is to find a low-dimensional representation of input covariates while preserving the statistical correlation with output targets. DRR is particularly suited for visualization of high-dimensional data as well as efficient regressor design with a reduced input dimension. In this paper we propose a novel nonlinear method for DRR that exploits the kernel Gram matrices of the input and output. While most existing DRR techniques rely on inverse regression, our approach removes the need for explicit slicing of the output space by using covariance operators in RKHS. This unique property makes DRR applicable to problem domains with high-dimensional output data that may contain significant amounts of noise. Although recent kernel dimensionality reduction algorithms use RKHS covariance operators to quantify the conditional dependency between the input and the targets via the dimension-reduced input, they are either limited to a transduction setting or to linear input subspaces, and they lack closed-form solutions. In contrast, our approach provides a closed-form solution for the nonlinear basis functions onto which any new input point can be easily projected. We demonstrate the benefits of the proposed method in a comprehensive set of evaluations on several important regression problems that arise in computer vision.
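The abstract describes the method only at a high level: centered Gram matrices of input and output, a covariance-operator smoothing that replaces explicit output-space slicing, and a closed-form (generalized eigenvalue) solution for the nonlinear bases. The sketch below is one plausible reading of that recipe, not the paper's exact formulation; the RBF kernels, the regularization constants, and the function names (rbf_gram, coir_fit) are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist


def rbf_gram(A, B, sigma):
    """RBF (Gaussian) kernel Gram matrix between row-sample matrices A and B."""
    return np.exp(-cdist(A, B, "sqeuclidean") / (2.0 * sigma ** 2))


def center(K):
    """Double-center a Gram matrix: H K H with H = I - (1/n) 11^T."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H


def coir_fit(X, Y, d, sigma_x=1.0, sigma_y=1.0, eps=1e-3):
    """Covariance-operator inverse regression, minimal sketch.

    Returns the coefficient matrix V (n x d) of the nonlinear bases and the
    d-dimensional embedding Z of the training inputs.
    """
    n = X.shape[0]
    Kx = center(rbf_gram(X, X, sigma_x))
    Ky = center(rbf_gram(Y, Y, sigma_y))

    # Smoothed conditional expectation via covariance operators: the
    # regularized inverse (Ky + n*eps*I)^{-1} stands in for the explicit
    # output-space slicing of classical inverse regression.
    S = Ky @ np.linalg.solve(Ky + n * eps * np.eye(n), Kx)

    A = Kx @ S                        # inverse-regression covariance, Gram form
    A = 0.5 * (A + A.T)               # symmetrize against round-off
    B = Kx @ Kx + 1e-8 * np.eye(n)    # normalization; ridge keeps it definite

    _, V = eigh(A, B)                 # generalized eigenproblem, ascending order
    V = V[:, ::-1][:, :d]             # keep the top-d directions
    return V, Kx @ V                  # bases and embedded training points
```

Because V is given in closed form by a single generalized eigendecomposition, a new input x* is embedded by computing its centered kernel vector against the training inputs and projecting it onto V, which is the out-of-sample property the abstract emphasizes.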
Pages: 488-495
Page count: 8
Related Papers
50 records in total
  • [41] A Regression-Based Interpretation of the Inverse of the Sample Covariance Matrix
    Kwan, Clarence C. Y.
    SPREADSHEETS IN EDUCATION, 2014, 7 (01):
  • [42] Supervised Dimensionality Reduction Methods via Recursive Regression
    Liu, Yun
    Zhang, Rui
    Nie, Feiping
    Li, Xuelong
    Ding, Chris
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, 31 (09) : 3269 - 3279
  • [43] Compressed Spectral Regression for Efficient Nonlinear Dimensionality Reduction
    Cai, Deng
    PROCEEDINGS OF THE TWENTY-FOURTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE (IJCAI), 2015, : 3359 - 3365
  • [45] Robust inverse regression for dimension reduction
    Dong, Yuexiao
    Yu, Zhou
    Zhu, Liping
    JOURNAL OF MULTIVARIATE ANALYSIS, 2015, 134 : 71 - 81
  • [46] Local Dimensionality Reduction for Non-Parametric Regression
    Hoffmann, Heiko
    Schaal, Stefan
    Vijayakumar, Sethu
    NEURAL PROCESSING LETTERS, 2009, 29
  • [47] Dimensionality reduction of multidimensional temporal data through regression
    Rangarajan, L
    Nagabhushan, P
    PATTERN RECOGNITION LETTERS, 2004, 25 (08) : 899 - 910
  • [48] Local Dimensionality Reduction for Non-Parametric Regression
    Hoffmann, Heiko
    Schaal, Stefan
    Vijayakumar, Sethu
    NEURAL PROCESSING LETTERS, 2009, 29 (02) : 109 - 131
  • [49] Neighborhood Structure Preserving Ridge Regression for Dimensionality Reduction
    Shu, Xin
    Lu, Hongtao
    PATTERN RECOGNITION, 2012, 321 : 25 - 32
  • [50] Robust Covariance Representations With Large Margin Dimensionality Reduction for Visual Classification
    Sun, Qiule
    Zhang, Jianxin
    Zhu, Pengfei
    Wang, Qilong
    Li, Peihua
    IEEE ACCESS, 2018, 6 : 5531 - 5537