The Refinement of Models With the Aid of the Fuzzy k-Nearest Neighbors Approach

Cited by: 11
Authors
Roh, Seok-Beom [1 ]
Ahn, Tae-Chon [1 ]
Pedrycz, Witold [2 ,3 ]
Affiliations
[1] Wonkwang Univ, Dept Elect Elect & Informat Engn, Iksan 570749, South Korea
[2] Polish Acad Sci, Syst Res Inst, PL-01447 Warsaw, Poland
[3] Univ Alberta, Dept Elect & Comp Engn, Edmonton, AB T6G 2G7, Canada
Keywords
Fuzzy k-nearest neighbors (kNN); global model; incremental model; local model; model refinement; CLASSIFICATION; REGRESSION;
DOI
10.1109/TIM.2009.2025070
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Subject classification codes
0808; 0809;
Abstract
In this paper, we propose a new design methodology that supports the development of hybrid incremental models. These models result through an iterative process in which a parametric model and a nonparametric model are combined so that their underlying and complementary functionalities become fully exploited. The parametric component of the hybrid model captures some global relationships between the input variables and the output variable. The nonparametric model focuses on capturing local input-output relationships and thus augments the behavior of the model being formed at the global level. In the underlying design, we consider linear and quadratic regression to be a parametric model, whereas a fuzzy k-nearest neighbors model serves as the nonparametric counterpart of the overall model. Numeric results come from experiments that were carried out on some low-dimensional synthetic data sets and several machine learning data sets from the University of California-Irvine Machine Learning Repository.
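The abstract describes the mechanics of the hybrid incremental model: a parametric regressor captures the global input-output trend, and a fuzzy k-nearest neighbors component refines the fit locally. The sketch below is one plausible reading of that scheme, not the authors' implementation: a least-squares linear model is fit globally, and a fuzzy k-NN estimate of its training residuals (Keller-style inverse-distance memberships) supplies the local correction. The class name HybridIncrementalModel and the parameters k (neighborhood size) and m (fuzzifier) are illustrative assumptions.

# Minimal sketch (assumed reading of the abstract, not the authors' code):
# global linear regression refined locally by fuzzy k-NN over its residuals.
import numpy as np


class HybridIncrementalModel:
    def __init__(self, k=5, m=2.0):
        self.k = k      # number of neighbors used for the local refinement
        self.m = m      # fuzzifier; memberships are 1 / d**(2 / (m - 1))
        self.coef_ = None
        self.X_train = None
        self.residuals_ = None

    def fit(self, X, y):
        # Global (parametric) part: ordinary least squares with a bias term.
        Xb = np.hstack([X, np.ones((X.shape[0], 1))])
        self.coef_, *_ = np.linalg.lstsq(Xb, y, rcond=None)
        # Local (nonparametric) part: keep training residuals for fuzzy k-NN.
        self.X_train = X
        self.residuals_ = y - Xb @ self.coef_
        return self

    def predict(self, X):
        Xb = np.hstack([X, np.ones((X.shape[0], 1))])
        global_pred = Xb @ self.coef_
        correction = np.array([self._fuzzy_knn_residual(x) for x in X])
        return global_pred + correction

    def _fuzzy_knn_residual(self, x):
        # Fuzzy k-NN weighting: inverse-distance memberships over the k nearest
        # training points, applied to their residuals.
        d = np.linalg.norm(self.X_train - x, axis=1)
        idx = np.argsort(d)[: self.k]
        d_k = np.maximum(d[idx], 1e-12)  # avoid division by zero
        w = 1.0 / d_k ** (2.0 / (self.m - 1.0))
        return float(np.dot(w, self.residuals_[idx]) / w.sum())


if __name__ == "__main__":
    # Toy check on synthetic data: the local correction should pick up the
    # nonlinearity that the global linear model misses.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = 0.5 * X[:, 0] + np.sin(2 * X[:, 0]) + 0.05 * rng.standard_normal(200)
    model = HybridIncrementalModel(k=7, m=2.0).fit(X[:150], y[:150])
    err = np.mean((model.predict(X[150:]) - y[150:]) ** 2)
    print(f"test MSE of hybrid model: {err:.4f}")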
Pages: 604-615
Page count: 12
Related papers (50 in total)
  • [21] Landmine detection with ground penetrating radar using fuzzy K-nearest neighbors
    Frigui, H
    Gader, P
    Satyanarayana, K
    2004 IEEE INTERNATIONAL CONFERENCE ON FUZZY SYSTEMS, VOLS 1-3, PROCEEDINGS, 2004, : 1745 - 1749
  • [22] Introduction to machine learning: k-nearest neighbors
    Zhang, Zhongheng
    ANNALS OF TRANSLATIONAL MEDICINE, 2016, 4 (11)
  • [23] Distributionally Robust Weighted k-Nearest Neighbors
    Zhu, Shixiang
    Xie, Liyan
    Zhang, Minghe
    Gao, Rui
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022
  • [24] The research on an adaptive k-nearest neighbors classifier
    Yu, Xiao-Gao
    Yu, Xiao-Peng
    PROCEEDINGS OF 2006 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS, VOLS 1-7, 2006, : 1241 - 1246
  • [25] Random kernel k-nearest neighbors regression
    Srisuradetchai, Patchanok
    Suksrikran, Korn
    FRONTIERS IN BIG DATA, 2024, 7
  • [26] PATCH CONFIDENCE K-NEAREST NEIGHBORS DENOISING
    Angelino, Cesario V.
    Debreuve, Eric
    Barlaud, Michel
    2010 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, 2010, : 1129 - 1132
  • [27] Hypersphere anchor loss for K-Nearest neighbors
    Ye, Xiang
    He, Zihang
    Wang, Heng
    Li, Yong
    Applied Intelligence, 2023, 53 : 30319 - 30328
  • [28] AutoML for Stream k-Nearest Neighbors Classification
    Bahri, Maroua
    Veloso, Bruno
    Bifet, Albert
    Gama, Joao
    2020 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2020, : 597 - 602
  • [29] Maximizing Reverse k-Nearest Neighbors for Trajectories
    Al Rahat, Tamjid
    Arman, Arif
    Ali, Mohammed Eunus
    DATABASES THEORY AND APPLICATIONS, ADC 2018, 2018, 10837 : 262 - 274
  • [30] The research on an adaptive k-nearest neighbors classifier
    Yu, Xiaopeng
    Yu, Xiaogao
    PROCEEDINGS OF THE FIFTH IEEE INTERNATIONAL CONFERENCE ON COGNITIVE INFORMATICS, VOLS 1 AND 2, 2006, : 535 - 540