Dimensionality reduction using covariance operator inverse regression

Cited by: 0
Authors
Kim, Minyoung [1]
Pavlovic, Vladimir [1]
Affiliations
[1] Rutgers State Univ, Dept Comp Sci, Piscataway, NJ 08854 USA
DOI: not available
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405
Abstract
We consider the task of dimensionality reduction for regression (DRR), whose goal is to find a low-dimensional representation of input covariates while preserving the statistical correlation with output targets. DRR is particularly suited to visualizing high-dimensional data and to efficient regressor design with a reduced input dimension. In this paper we propose a novel nonlinear method for DRR that exploits the kernel Gram matrices of the input and the output. While most existing DRR techniques rely on inverse regression, our approach removes the need for explicit slicing of the output space by using covariance operators in RKHS. This unique property makes DRR applicable to problem domains with high-dimensional output data that may carry significant amounts of noise. Although recent kernel dimensionality reduction algorithms use RKHS covariance operators to quantify the conditional dependency between the input and the targets via the dimension-reduced input, they are either limited to a transduction setting or to linear input subspaces, and they lack closed-form solutions. In contrast, our approach provides a closed-form solution for the nonlinear basis functions, onto which any new input point can be easily projected. We demonstrate the benefits of the proposed method in a comprehensive set of evaluations on several important regression problems that arise in computer vision.
Pages: 488-495 (8 pages)
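The pipeline the abstract describes (Gram matrices of inputs and outputs, covariance-operator regularization in RKHS, and a closed-form eigen-solution for the nonlinear basis) can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' exact COIR objective: the regularized smoother `Ry`, the normalization matrix `A`, and the hyperparameters `eps` and `gamma` are assumptions made here for demonstration.

```python
import numpy as np
from scipy.linalg import eigh

def rbf_gram(Z, gamma=1.0):
    """RBF kernel Gram matrix for the rows of Z."""
    sq = np.sum(Z ** 2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T, 0.0)
    return np.exp(-gamma * d2)

def center(G):
    """Double-center a Gram matrix (empirical feature centering in RKHS)."""
    n = G.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ G @ H

def kernel_drr_fit(X, Y, d=2, eps=1e-3, gamma=1.0):
    """Closed-form kernel DRR sketch: a generalized eigenproblem on the
    centered Gram matrices of the inputs (Gx) and outputs (Gy).
    Ry = (Gy + n*eps*I)^{-1} Gy is an illustrative stand-in for the
    regularized output covariance operator used in inverse regression."""
    n = X.shape[0]
    Gx = center(rbf_gram(X, gamma))
    Gy = center(rbf_gram(Y.reshape(n, -1), gamma))
    Ry = np.linalg.solve(Gy + n * eps * np.eye(n), Gy)
    B = Gx @ Ry @ Gx                      # input-output dependence term
    A = Gx @ Gx + n * eps * np.eye(n)     # regularized normalization term
    # Symmetrize to guard against round-off before the symmetric solver
    B = 0.5 * (B + B.T)
    A = 0.5 * (A + A.T)
    w, V = eigh(B, A)                     # eigenvalues in ascending order
    alphas = V[:, ::-1][:, :d]            # coefficients of the top-d bases
    return alphas, Gx

def kernel_drr_project(alphas, Gx):
    """Embed the training inputs; a new point x would instead use the
    kernel evaluations k(x, x_i) against the training set."""
    return Gx @ alphas
```

Because the basis coefficients come from a single generalized eigendecomposition, the solution is closed-form, and projecting a new point only requires its kernel row against the training inputs — the property the abstract contrasts with iterative, non-closed-form kernel dimensionality reduction methods.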