Dimensionality reduction for supervised learning with reproducing kernel Hilbert spaces

Cited by: 0
Authors
Fukumizu, K
Bach, FR
Jordan, MI
Affiliations
[1] Inst Stat Math, Minato Ku, Tokyo 1068569, Japan
[2] Univ Calif Berkeley, Div Comp Sci, Berkeley, CA 94720 USA
[3] Univ Calif Berkeley, Dept Stat, Berkeley, CA 94720 USA
Keywords
regression; dimensionality reduction; variable selection; feature selection; kernel methods; conditional independence
DOI
Not available
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
We propose a novel method of dimensionality reduction for supervised learning problems. Given a regression or classification problem in which we wish to predict a response variable Y from an explanatory variable X, we treat the problem of dimensionality reduction as that of finding a low-dimensional "effective subspace" for X which retains the statistical relationship between X and Y. We show that this problem can be formulated in terms of conditional independence. To turn this formulation into an optimization problem, we establish a general nonparametric characterization of conditional independence using covariance operators on reproducing kernel Hilbert spaces. This characterization allows us to derive a contrast function for estimation of the effective subspace. Unlike many conventional methods for dimensionality reduction in supervised learning, the proposed method requires neither assumptions on the marginal distribution of X nor a parametric model of the conditional distribution of Y. We present experiments that compare the performance of the method with conventional methods.
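As a rough illustration of the contrast function described in the abstract, the sketch below evaluates one common empirical form, Tr[Gy (Gz + n*eps*I)^(-1)], where Gy and Gz are centered Gram matrices and Z = XB is the candidate projection. This is a minimal sketch assuming a Gaussian (RBF) kernel; the helper names centered_gram and kdr_contrast, the bandwidth sigma_z, and the regularizer eps are illustrative choices rather than notation from the paper, and minimizing the contrast over orthonormal B would require a separate optimization routine (e.g., projected gradient descent).

```python
import numpy as np

def centered_gram(X, sigma):
    """Centered Gaussian (RBF) Gram matrix of the rows of X."""
    sq = np.sum(X ** 2, axis=1)
    dists = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    K = np.exp(-dists / (2.0 * sigma ** 2))
    n = K.shape[0]
    H = np.eye(n) - np.full((n, n), 1.0 / n)  # centering matrix: returns H K H
    return H @ K @ H

def kdr_contrast(B, X, Gy, sigma_z=1.0, eps=1e-3):
    """Empirical contrast Tr[Gy (Gz + n*eps*I)^{-1}] for a candidate
    projection B with orthonormal columns; smaller values indicate that
    Z = X @ B retains more of the statistical relationship with Y."""
    Z = X @ B                                   # project X onto the subspace
    Gz = centered_gram(Z, sigma_z)
    n = X.shape[0]
    return np.trace(Gy @ np.linalg.inv(Gz + n * eps * np.eye(n)))

# Toy usage: Y depends on X only through a single direction.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(100)
Gy = centered_gram(y[:, None], sigma=1.0)
B, _ = np.linalg.qr(rng.standard_normal((5, 1)))  # random 1-D candidate subspace
print(kdr_contrast(B, X, Gy))
```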
Pages: 73-99 (27 pages)
Related Papers (50 total)
  • [1] Kernel dimensionality reduction for supervised learning
    Fukumizu, K
    Bach, FR
    Jordan, MI
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 16, 2004, 16 : 81 - 88
  • [2] Reproducing Kernel Hilbert Spaces
    Seddighi, K.
    IRANIAN JOURNAL OF SCIENCE & TECHNOLOGY, 1993, 17 (03)
  • [3] Learning Theory with Consensus in Reproducing Kernel Hilbert Spaces
    Deng, Zhaoda
    Gregory, Jessica
    Kurdila, Andrew
    2012 AMERICAN CONTROL CONFERENCE (ACC), 2012, : 1400 - 1405
  • [4] Adaptive supervised learning on data streams in reproducing kernel Hilbert spaces with data sparsity constraint
    Wang, Haodong
    Li, Quefeng
    Liu, Yufeng
    STAT, 2023, 12 (01)
  • [5] An efficient multiple kernel learning in reproducing kernel Hilbert spaces (RKHS)
    Xu, Lixiang
    Luo, Bin
    Tang, Yuanyan
    Ma, Xiaohua
    INTERNATIONAL JOURNAL OF WAVELETS MULTIRESOLUTION AND INFORMATION PROCESSING, 2015, 13 (02)
  • [6] Adaptive Learning in Cartesian Product of Reproducing Kernel Hilbert Spaces
    Yukawa, Masahiro
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2015, 63 (22) : 6037 - 6048
  • [7] Learning Partial Differential Equations in Reproducing Kernel Hilbert Spaces
    Stepaniants, George
    JOURNAL OF MACHINE LEARNING RESEARCH, 2023, 24
  • [8] Pasting Reproducing Kernel Hilbert Spaces
    Sawano, Yoshihiro
    NEW TRENDS IN ANALYSIS AND INTERDISCIPLINARY APPLICATIONS, 2017, : 401 - 407
  • [9] Noncommutative reproducing kernel Hilbert spaces
    Ball, Joseph A.
    Marx, Gregory
    Vinnikov, Victor
    JOURNAL OF FUNCTIONAL ANALYSIS, 2016, 271 (07) : 1844 - 1920
  • [10] On isomorphism of reproducing kernel Hilbert spaces
    V. V. Napalkov
    V. V. Napalkov
    DOKLADY MATHEMATICS, 2017, 95 : 270 - 272