Robust kernel ensemble regression in diversified kernel space with shared parameters

Cited by: 0
Authors
Zhi-feng Liu
Liu Chen
Sumet Mehta
Xiang-Jun Shen
Yu-bao Cui
Affiliations
[1] Jiangsu University,School of Computer Science and Communication Engineering
[2] JCDM College of Engineering,Department of Electronics and Communication Engineering
[3] The Affiliated Wuxi People’s Hospital of Nanjing Medical University,Clinical Research Center
Source
Applied Intelligence | 2023, Vol. 53
Keywords
Kernel regression; Ensemble regression; Multiple kernels; Shared parameters
DOI: not available
Abstract
Kernel regression is an effective non-parametric regression method, but choosing an appropriate kernel and its parameters remains difficult. In this paper, we propose a robust kernel ensemble regression model (RKER) in diversified multiple Reproducing Kernel Hilbert Spaces (RKHSs). Motivated by multi-view data processing, we treat each kernel representation as one view of the data and apply this multi-view modeling idea to the kernel regression scenario. RKER combines multiple individual kernel regressors into one ensemble, where each regressor is associated with a weight learned directly from its view of the data without manual intervention. The problem of selecting a kernel and its parameters in traditional kernel regression is thus replaced by finding the best kernel combination in diversified multiple solution spaces. With this multi-view modeling, RKER achieves superior overall regression performance and is more robust to parameter selection. Furthermore, the parameters in the multiple RKHSs can be learned with both individual-specific and shared structures. Experimental results on the Abalone and Facebook datasets demonstrate that the proposed RKER model outperforms state-of-the-art regression and ensemble methods such as Random Forest, Gradient Boosting Regressor, and eXtreme Gradient Boosting.
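The ensemble idea described in the abstract — one kernel regressor per kernel, each weighted by how well it fits the data — can be illustrated with a minimal sketch. This is not the paper's RKER optimization (which learns the weights jointly with individual and shared structures in multiple RKHSs); it is a simplified stand-in that fits one kernel ridge regressor per RBF bandwidth and weights each by its validation error. All function names and the softmax-over-negative-MSE weighting are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma):
    # Pairwise squared Euclidean distances -> Gaussian (RBF) kernel matrix
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_kernel_ridge(K, y, lam=1e-2):
    # Standard kernel ridge regression: solve (K + lam*I) alpha = y
    n = K.shape[0]
    return np.linalg.solve(K + lam * np.eye(n), y)

def ensemble_kernel_regression(X_tr, y_tr, X_val, y_val, gammas, lam=1e-2):
    """Fit one kernel ridge regressor per bandwidth gamma, then weight
    each regressor by its validation accuracy (softmax over negative
    normalized MSE).  A simplified stand-in for RKER's joint learning."""
    alphas, val_preds = [], []
    for g in gammas:
        K = rbf_kernel(X_tr, X_tr, g)
        a = fit_kernel_ridge(K, y_tr, lam)
        alphas.append(a)
        val_preds.append(rbf_kernel(X_val, X_tr, g) @ a)
    mse = np.array([np.mean((p - y_val) ** 2) for p in val_preds])
    w = np.exp(-mse / mse.mean())          # better kernels get larger weights
    w /= w.sum()
    def predict(X):
        # Weighted combination of the individual kernel regressors
        return sum(wk * (rbf_kernel(X, X_tr, g) @ a)
                   for wk, g, a in zip(w, gammas, alphas))
    return predict, w
```

Because the weights are learned from the data rather than fixed by hand, a poorly chosen bandwidth simply receives a small weight, which is the robustness-to-parameter-selection property the abstract claims for RKER.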
Pages: 1051–1067
Page count: 16