Random kernel k-nearest neighbors regression

Cited by: 4
Authors
Srisuradetchai, Patchanok [1]
Suksrikran, Korn [1]
Affiliations
[1] Thammasat Univ, Dept Math & Stat, Pathum Thani, Thailand
Source
FRONTIERS IN BIG DATA | 2024, Vol. 7
Keywords
bootstrapping; feature selection; k-nearest neighbors regression; kernel k-nearest neighbors; state-of-the-art (SOTA); MOLECULAR DESCRIPTORS; NEURAL-NETWORKS; ENSEMBLE; ALGORITHM; MODEL; SET;
DOI
10.3389/fdata.2024.1402384
Chinese Library Classification (CLC)
TP [Automation technology; computer technology];
Subject classification code
0812;
Abstract
The k-nearest neighbors (KNN) regression method, known for its nonparametric nature, is highly valued for its simplicity and its effectiveness in handling complex structured data, particularly in big data contexts. However, this method is susceptible to overfitting and fit discontinuity, which present significant challenges. This paper introduces random kernel k-nearest neighbors (RK-KNN) regression as a novel approach that is well-suited for big data applications. It integrates kernel smoothing with bootstrap sampling to enhance prediction accuracy and model robustness. The method aggregates multiple predictions, each obtained by randomly sampling observations from the training dataset and selecting a subset of input variables for kernel KNN (K-KNN). A comprehensive evaluation of RK-KNN on 15 diverse datasets, employing various kernel functions including Gaussian and Epanechnikov, demonstrates its superior performance. Compared to standard KNN and random KNN (R-KNN) models, it significantly reduces the root mean square error (RMSE) and mean absolute error, and improves R-squared values. The RK-KNN variant employing the kernel function that yields the lowest RMSE is then benchmarked against state-of-the-art methods, including support vector regression, artificial neural networks, and random forests.
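Based on the abstract's description, the RK-KNN procedure can be sketched as an ensemble of kernel-weighted KNN regressors, each fitted on a bootstrap sample of the rows and a random subset of the input variables, with the ensemble prediction taken as the average. The following is an illustrative reconstruction only, not the authors' implementation: the ensemble size, the number of sampled features, and the bandwidth choice (scaling neighbor distances by the (k+1)-th nearest distance) are assumptions.

```python
import numpy as np

def kernel_knn_predict(X_train, y_train, X_query, k=5, kernel="gaussian"):
    """Kernel KNN regression: predict each query point as a kernel-weighted
    average of its k nearest training responses."""
    preds = np.empty(len(X_query))
    for i, x in enumerate(X_query):
        d = np.linalg.norm(X_train - x, axis=1)
        order = np.argsort(d)
        idx = order[:k]
        # Assumed bandwidth: the (k+1)-th nearest distance, so scaled
        # distances u of the k neighbors lie in [0, 1).
        bandwidth = d[order[min(k, len(d) - 1)]] + 1e-12
        u = d[idx] / bandwidth
        if kernel == "gaussian":
            w = np.exp(-0.5 * u**2)
        else:  # Epanechnikov
            w = np.maximum(1.0 - u**2, 0.0)
        preds[i] = np.average(y_train[idx], weights=w + 1e-12)
    return preds

def rk_knn_predict(X_train, y_train, X_query, n_models=25, k=5,
                   n_features=None, kernel="gaussian", seed=0):
    """Random kernel KNN (RK-KNN) sketch: average kernel-KNN predictions
    over bootstrap resamples of the rows and random feature subsets."""
    rng = np.random.default_rng(seed)
    n, p = X_train.shape
    if n_features is None:
        n_features = max(1, int(np.sqrt(p)))  # assumed default
    all_preds = np.zeros((n_models, len(X_query)))
    for m in range(n_models):
        rows = rng.integers(0, n, size=n)                     # bootstrap sample
        cols = rng.choice(p, size=n_features, replace=False)  # feature subset
        all_preds[m] = kernel_knn_predict(
            X_train[np.ix_(rows, cols)], y_train[rows], X_query[:, cols],
            k=k, kernel=kernel)
    return all_preds.mean(axis=0)
```

Averaging over bootstrap resamples smooths the piecewise-constant KNN fit (addressing the discontinuity issue), while the kernel weighting down-weights more distant neighbors within each model.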
Pages: 14
Related papers
50 results in total
  • [21] K-Nearest Neighbors regression for the discrimination of gamma rays and neutrons in organic scintillators
    Durbin, Matthew
    Wonders, M. A.
    Flaska, Marek
    Lintereur, Azaree T.
    NUCLEAR INSTRUMENTS & METHODS IN PHYSICS RESEARCH SECTION A-ACCELERATORS SPECTROMETERS DETECTORS AND ASSOCIATED EQUIPMENT, 2021, 987
  • [22] Particles Contaminations Detection during Plasma Etching Process by using k-Nearest Neighbors and Fuzzy k-Nearest Neighbors
    Somari, Noratika Mohammad
    Abdullah, Mohd Firdaus
    Osman, Muhammad Khusairi
    Nazelan, Abdul Mu'iz
    Ahmad, Khairul Azman
    Appanan, Sooria Pragash Rao S.
    Hooi, Loh Kwang
    2016 6TH IEEE INTERNATIONAL CONFERENCE ON CONTROL SYSTEM, COMPUTING AND ENGINEERING (ICCSCE), 2016, : 512 - 516
  • [23] Introduction to machine learning: k-nearest neighbors
    Zhang, Zhongheng
    ANNALS OF TRANSLATIONAL MEDICINE, 2016, 4 (11)
  • [24] Distributionally Robust Weighted k-Nearest Neighbors
    Zhu, Shixiang
    Xie, Liyan
    Zhang, Minghe
    Gao, Rui
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [25] The research on an adaptive k-nearest neighbors classifier
    Yu, Xiao-Gao
    Yu, Xiao-Peng
    PROCEEDINGS OF 2006 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS, VOLS 1-7, 2006, : 1241 - 1246
  • [26] A NEW FUZZY K-NEAREST NEIGHBORS ALGORITHM
    Li, Chengjie
    Pei, Zheng
    Li, Bo
    Zhang, Zhen
    INTELLIGENT DECISION MAKING SYSTEMS, VOL. 2, 2010, : 246 - +
  • [27] PATCH CONFIDENCE K-NEAREST NEIGHBORS DENOISING
    Angelino, Cesario V.
    Debreuve, Eric
    Barlaud, Michel
    2010 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, 2010, : 1129 - 1132
  • [28] A FUZZY EXTENDED K-NEAREST NEIGHBORS RULE
    BEREAU, M
    DUBUISSON, B
    FUZZY SETS AND SYSTEMS, 1991, 44 (01) : 17 - 32
  • [29] Hypersphere anchor loss for K-Nearest neighbors
    Xiang Ye
    Zihang He
    Heng Wang
    Yong Li
    Applied Intelligence, 2023, 53 : 30319 - 30328
  • [30] A k-Nearest Neighbors Approach for COCOMO Calibration
    Le, Phu
    Vu Nguyen
    2017 4TH NAFOSTED CONFERENCE ON INFORMATION AND COMPUTER SCIENCE (NICS), 2017, : 219 - 224