Dimensionality reduction using covariance operator inverse regression

Cited by: 0
Authors
Kim, Minyoung [1]
Pavlovic, Vladimir [1]
Affiliations
[1] Rutgers State Univ, Dept Comp Sci, Piscataway, NJ 08854 USA
Keywords
DOI
Not available
CLC number
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We consider the task of dimensionality reduction for regression (DRR), whose goal is to find a low-dimensional representation of the input covariates while preserving their statistical correlation with the output targets. DRR is particularly suited to visualization of high-dimensional data as well as efficient regressor design with a reduced input dimension. In this paper we propose a novel nonlinear method for DRR that exploits the kernel Gram matrices of the input and output. While most existing DRR techniques rely on inverse regression, our approach uses covariance operators in RKHS to remove the need for explicit slicing of the output space. This property makes DRR applicable to problem domains with high-dimensional output data that may contain significant amounts of noise. Although recent kernel dimensionality reduction algorithms use RKHS covariance operators to quantify the conditional dependency between the input and the targets via the dimension-reduced input, they are either limited to a transduction setting or to linear input subspaces, and they do not admit closed-form solutions. In contrast, our approach provides a closed-form solution for the nonlinear basis functions onto which any new input point can easily be projected. We demonstrate the benefits of the proposed method in a comprehensive set of evaluations on several important regression problems that arise in computer vision.
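To make the abstract's recipe concrete, the sketch below shows one generic way such a covariance-operator inverse regression can be assembled from the two kernel Gram matrices: a kernel-SIR-style regularized generalized eigenproblem whose left-hand side approximates the covariance of the inverse regression E[phi(X)|Y] and whose solution yields basis functions onto which new inputs project in closed form. This is a minimal illustration, not the authors' exact COIR algorithm; the RBF kernels, the regularizers eps/delta, and the helper names (rbf_gram, fit_drr, project) are assumptions made for this example.

```python
# Illustrative sketch (not the paper's exact COIR algorithm): kernel
# inverse-regression DRR built from input/output Gram matrices via a
# regularized generalized eigenproblem.
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist


def rbf_gram(A, B, gamma):
    """RBF Gram matrix K[i, j] = exp(-gamma * ||A_i - B_j||^2)."""
    return np.exp(-gamma * cdist(A, B, "sqeuclidean"))


def center_gram(K):
    """Double-center a Gram matrix: H K H with H = I - (1/n) 11^T."""
    n = K.shape[0]
    H = np.eye(n) - np.full((n, n), 1.0 / n)
    return H @ K @ H


def fit_drr(X, Y, d, gamma_x=1.0, gamma_y=1.0, eps=1e-3, delta=1e-3):
    """Learn d nonlinear DRR directions as kernel expansions over the data.

    Solves  Kx Ky (Ky + n*eps*I)^{-1} Kx a = lam (Kx Kx + n*delta*Kx) a
    with centered Gram matrices Kx, Ky: the left-hand side plays the role
    of the covariance of the inverse regression E[phi(X)|Y], the right-hand
    side the (regularized) total covariance of phi(X).
    """
    n = X.shape[0]
    Y = Y.reshape(n, -1)                       # allow 1-D targets
    Kx = center_gram(rbf_gram(X, X, gamma_x))
    Ky = center_gram(rbf_gram(Y, Y, gamma_y))
    M = Kx @ Ky @ np.linalg.solve(Ky + n * eps * np.eye(n), Kx)
    N = Kx @ Kx + n * delta * Kx + 1e-8 * np.eye(n)
    M, N = 0.5 * (M + M.T), 0.5 * (N + N.T)    # symmetrize against round-off
    _, vecs = eigh(M, N)                       # ascending generalized eigenvalues
    return vecs[:, ::-1][:, :d]                # coefficients A, one column per direction


def project(X_train, X_new, A, gamma_x=1.0):
    """Closed-form projection of new inputs onto the learned directions."""
    K_tr = rbf_gram(X_train, X_train, gamma_x)
    K_new = rbf_gram(X_new, X_train, gamma_x)
    # Center the test rows consistently with the centered training Gram matrix.
    K_new_c = (K_new - K_new.mean(axis=1, keepdims=True)
               - K_tr.mean(axis=0) + K_tr.mean())
    return K_new_c @ A
```

Under these assumptions, the training embedding is simply the centered input Gram matrix times the coefficient matrix A, and `project` handles out-of-sample points without re-solving the eigenproblem, mirroring the closed-form, projectable nonlinear basis functions claimed in the abstract.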
Pages: 488-495
Page count: 8