Robust kernel ensemble regression in diversified kernel space with shared parameters

Cited by: 0
Authors
Zhi-feng Liu
Liu Chen
Sumet Mehta
Xiang-Jun Shen
Yu-bao Cui
Affiliations
[1] Jiangsu University, School of Computer Science and Communication Engineering
[2] JCDM College of Engineering, Department of Electronics and Communication Engineering
[3] The Affiliated Wuxi People’s Hospital of Nanjing Medical University, Clinical Research Center
Source
Applied Intelligence | 2023, Vol. 53
Keywords
Kernel regression; Ensemble regression; Multiple kernels; Shared parameters
Abstract
Kernel regression is an effective non-parametric regression method; however, such methods face the problem of choosing an appropriate kernel and its parameters. In this paper, we propose a robust kernel ensemble regression model (RKER) in diversified multiple Reproducing Kernel Hilbert Spaces (RKHSs). Motivated by multi-view data processing, we treat each kernel representation as one view of the data and apply this multi-view modeling idea to the kernel regression scenario. The proposed RKER uses an ensemble strategy to combine multiple individual regressors into one, where each kernel regressor is associated with a weight that is learned directly from one view of the data without manual intervention. The problem of selecting a kernel and its parameters in traditional kernel regression is thus overcome by finding the best kernel combination in diversified multiple solution spaces. With this multi-view modeling, RKER achieves superior overall regression performance and is more robust to parameter selection. Furthermore, the parameters in the multiple RKHSs are learned with both individual-specific and shared structures. Experimental results on the Abalone and Facebook datasets demonstrate that the proposed RKER model performs best among state-of-the-art regression and ensemble methods such as Random Forest, Gradient Boosting Regressor, and eXtreme Gradient Boosting.
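The abstract describes RKER only at a high level. As an illustration of the general idea of treating each kernel as one view and combining weighted per-kernel regressors, the following is a minimal Python sketch using scikit-learn's KernelRidge. It is not the paper's RKER objective: here the per-kernel weights come from validation error rather than being learned jointly with individual-specific and shared structures, and the kernel configurations, the softmax weighting, and the helper names fit_kernel_ensemble / predict_kernel_ensemble are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of a multi-kernel ensemble regressor (NOT the paper's RKER
# objective): each kernel defines one "view" of the data, one kernel ridge
# regressor is fit per view, and the views are combined with weights derived
# from validation error. RKER instead learns the weights and parameters jointly
# with individual-specific and shared structures; this is only an illustration.
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import train_test_split


def fit_kernel_ensemble(X, y, kernel_configs, alpha=1.0):
    """Fit one KernelRidge model per kernel and weight it by validation fit."""
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.25, random_state=0)
    models, errors = [], []
    for cfg in kernel_configs:
        model = KernelRidge(alpha=alpha, **cfg)
        model.fit(X_tr, y_tr)
        errors.append(np.mean((model.predict(X_val) - y_val) ** 2))
        models.append(model)
    # Smaller validation error -> larger weight (softmax over scaled negative errors).
    errors = np.asarray(errors)
    weights = np.exp(-errors / errors.mean())
    weights /= weights.sum()
    return models, weights


def predict_kernel_ensemble(models, weights, X):
    """Weighted combination of the individual kernel regressors."""
    preds = np.stack([m.predict(X) for m in models], axis=0)
    return weights @ preds


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(300, 2))
    y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.standard_normal(300)
    # A diversified kernel space: several RBF bandwidths plus a polynomial kernel.
    configs = [{"kernel": "rbf", "gamma": g} for g in (0.1, 1.0, 10.0)]
    configs.append({"kernel": "polynomial", "degree": 3})
    models, weights = fit_kernel_ensemble(X, y, configs)
    print("learned kernel weights:", np.round(weights, 3))
```

The weighting step stands in for the paper's learned combination: a kernel whose regressor fits the held-out data well receives a larger share of the final prediction, so no single kernel or bandwidth has to be chosen by hand.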
Pages: 1051-1067
Number of pages: 16