Random kernel k-nearest neighbors regression

Cited by: 4
Authors
Srisuradetchai, Patchanok [1 ]
Suksrikran, Korn [1 ]
Affiliations
[1] Thammasat Univ, Dept Math & Stat, Pathum Thani, Thailand
Source
FRONTIERS IN BIG DATA, 2024, Vol. 7
Keywords
bootstrapping; feature selection; k-nearest neighbors regression; kernel k-nearest neighbors; state-of-the-art (SOTA); MOLECULAR DESCRIPTORS; NEURAL-NETWORKS; ENSEMBLE; ALGORITHM; MODEL; SET
DOI
10.3389/fdata.2024.1402384
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
The k-nearest neighbors (KNN) regression method, known for its nonparametric nature, is highly valued for its simplicity and its effectiveness in handling complex structured data, particularly in big data contexts. However, the method is susceptible to overfitting and fit discontinuity, which present significant challenges. This paper introduces random kernel k-nearest neighbors (RK-KNN) regression, a novel approach well suited to big data applications that integrates kernel smoothing with bootstrap sampling to enhance prediction accuracy and model robustness. RK-KNN aggregates multiple kernel KNN (K-KNN) predictions, each fitted on a bootstrap sample drawn from the training dataset and a randomly selected subset of the input variables. A comprehensive evaluation on 15 diverse datasets, employing various kernel functions including the Gaussian and Epanechnikov kernels, demonstrates its superior performance: compared with standard KNN and random KNN (R-KNN), it significantly reduces the root mean square error (RMSE) and mean absolute error and improves R-squared values. The RK-KNN variant whose kernel function yields the lowest RMSE is then benchmarked against state-of-the-art methods, including support vector regression, artificial neural networks, and random forests.
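The abstract outlines the RK-KNN procedure in prose; the sketch below restates it in Python under stated assumptions. Each of B rounds fits a kernel-weighted KNN on a bootstrap sample of the training rows restricted to a random subset of the input variables, and the final prediction averages the rounds. The function names, the bandwidth rule (distance to the k-th neighbor), and the default subset size are illustrative assumptions, not the authors' reference implementation.

import numpy as np

def gaussian_kernel(u):
    return np.exp(-0.5 * u ** 2)

def epanechnikov_kernel(u):
    return np.maximum(0.0, 0.75 * (1.0 - u ** 2))

def kernel_knn_predict(X, y, x_query, k=5, kernel=gaussian_kernel):
    # Kernel-weighted KNN: average the k nearest targets, weighted by a
    # kernel of the distance scaled by the k-th neighbor distance.
    d = np.linalg.norm(X - x_query, axis=1)
    idx = np.argsort(d)[:k]
    h = d[idx].max() * 1.0001 + 1e-12  # assumed bandwidth choice
    w = kernel(d[idx] / h)
    return float(np.sum(w * y[idx]) / np.sum(w))

def rk_knn_predict(X, y, X_query, B=100, k=5, m=None,
                   kernel=gaussian_kernel, seed=None):
    # RK-KNN sketch: aggregate B kernel KNN predictions, each fitted on a
    # bootstrap sample of the rows and a random subset of the columns.
    rng = np.random.default_rng(seed)
    n, p = X.shape
    m = m or max(1, int(np.sqrt(p)))  # assumed feature-subset size
    preds = np.zeros((B, len(X_query)))
    for b in range(B):
        rows = rng.integers(0, n, size=n)            # bootstrap rows
        cols = rng.choice(p, size=m, replace=False)  # random features
        Xb, yb = X[np.ix_(rows, cols)], y[rows]
        for j, xq in enumerate(X_query):
            preds[b, j] = kernel_knn_predict(Xb, yb, xq[cols], k, kernel)
    return preds.mean(axis=0)  # aggregate by averaging

Calling rk_knn_predict(X_train, y_train, X_test, kernel=epanechnikov_kernel) would correspond to the Epanechnikov variant discussed in the abstract; the B rounds are independent, so the outer loop parallelizes trivially.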
Pages: 14
Related Papers
50 records in total
  • [1] k-Nearest neighbors local linear regression for functional and missing data at random
    Rachdi, Mustapha
    Laksaci, Ali
    Kaid, Zoulikha
    Benchiha, Abbassia
    Al-Awadhi, Fahimah A.
    STATISTICA NEERLANDICA, 2021, 75 (01) : 42 - 65
  • [2] Kernel difference-weighted k-nearest neighbors classification
    Zuo, Wangmeng
    Wang, Kuanquan
    Zhang, Hongzhi
    Zhang, David
    ADVANCED INTELLIGENT COMPUTING THEORIES AND APPLICATIONS, PROCEEDINGS: WITH ASPECTS OF ARTIFICIAL INTELLIGENCE, 2007, 4682 : 861 - 870
  • [3] K-Nearest Neighbors Hashing
    He, Xiangyu
    Wang, Peisong
    Cheng, Jian
    2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2019), 2019, : 2834 - 2843
  • [4] Local interpretation of nonlinear regression model with k-nearest neighbors
    Kaneko, Hiromasa
    DIGITAL CHEMICAL ENGINEERING, 2023, 6
  • [5] k-Nearest Neighbors Estimator for Functional Asymmetry Shortfall Regression
    Alamari, Mohammed B.
    Almulhim, Fatimah A.
    Kaid, Zoulikha
    Laksaci, Ali
    SYMMETRY-BASEL, 2024, 16 (07):
  • [6] Prediction of Beef Production Using Linear Regression, Random Forest and k-Nearest Neighbors Algorithms
    Yildiz, Berkant Ismail
    Karabag, Kemal
    KSU TARIM VE DOGA DERGISI-KSU JOURNAL OF AGRICULTURE AND NATURE, 2025, 28 (01) : 247 - 255
  • [7] Modernizing k-nearest neighbors
    Elizabeth Yancey, Robin
    Xin, Bochao
    Matloff, Norm
    STAT, 2021, 10 (01):
  • [8] The Comparison of Linear Regression Method and K-Nearest Neighbors in Scholarship Recipient
    Okfalisa
    Fitriani, Ratika
    Vitriani, Yelfi
    2018 19TH IEEE/ACIS INTERNATIONAL CONFERENCE ON SOFTWARE ENGINEERING, ARTIFICIAL INTELLIGENCE, NETWORKING AND PARALLEL/DISTRIBUTED COMPUTING (SNPD), 2018, : 194 - 199
  • [9] Model selection for k-nearest neighbors regression using VC bounds
    Cherkassky, V
    Ma, YQ
    Tang, J
    PROCEEDINGS OF THE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS 2003, VOLS 1-4, 2003, : 1143 - 1148
  • [10] Method for Determining K-Nearest Neighbors
    Kittler, J
    KYBERNETES, 1978, 7 (04) : 313 - 315