Residual k-Nearest Neighbors Label Distribution Learning

Cited by: 1
Authors
Wang, Jing [1 ,2 ]
Feng, Fu [1 ,2 ]
Lv, Jianhui [3 ]
Geng, Xin
Affiliations
[1] Southeast Univ, Sch Comp Sci & Engn, Nanjing 210096, Peoples R China
[2] Southeast Univ, Key Lab New Generat Artificial Intelligence Techno, Minist Educ, Nanjing, Peoples R China
[3] Jinzhou Med Univ, Affiliated Hosp 1, Jinzhou 121012, Peoples R China
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation;
Keywords
Label Distribution Learning (LDL); Label ambiguity; Generalization; Manifold; Neighborhood;
DOI
10.1016/j.patcog.2024.111006
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Label Distribution Learning (LDL) is a learning paradigm that assigns a label distribution to each instance. It aims to learn the label distributions of training instances and predict those of unseen ones. Algorithm Adaptation kNN (AA-kNN), which adapts the kNN algorithm to LDL, is one of the most representative LDL baselines. Its generalization risk has been proven to approach zero given infinite training data. Despite this theoretical advantage, AA-kNN generally performs poorly in practice because real-world LDL problems provide only finite, often small, training sets. In this paper, we improve AA-kNN and propose a novel method called Residual k-Nearest Neighbors Label Distribution Learning (RkNN-LDL). First, RkNN-LDL introduces residual label distribution learning. Second, RkNN-LDL exploits the neighborhood structure of label distributions. In theoretical analysis, we prove that RkNN-LDL has a tighter generalization bound than AA-kNN. Moreover, extensive experiments validate that RkNN-LDL outperforms several state-of-the-art LDL methods and statistically outperforms AA-kNN.
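The AA-kNN baseline that the abstract builds on can be sketched in a few lines: it predicts the label distribution of a query as the mean of the label distributions of its k nearest training instances. The sketch below is a minimal illustration only, assuming Euclidean distance and uniform neighbor weighting; the function and variable names are hypothetical and not from the paper.

```python
import numpy as np

def aa_knn_predict(X_train, D_train, x_query, k=3):
    """AA-kNN sketch: average the label distributions of the k nearest
    training instances (Euclidean distance, uniform weights assumed)."""
    # Distance from the query to every training instance.
    dists = np.linalg.norm(X_train - x_query, axis=1)
    # Indices of the k nearest neighbors.
    nn = np.argsort(dists)[:k]
    # The mean of valid label distributions is itself a valid
    # distribution (non-negative entries summing to 1).
    return D_train[nn].mean(axis=0)

# Toy data: 3 instances with 1 feature; each row of D_train is a
# label distribution over 2 labels and sums to 1.
X_train = np.array([[0.0], [1.0], [10.0]])
D_train = np.array([[1.0, 0.0], [0.5, 0.5], [0.0, 1.0]])
pred = aa_knn_predict(X_train, D_train, np.array([0.4]), k=2)
# The two nearest neighbors are the first two instances, so
# pred is their mean: [0.75, 0.25].
```

RkNN-LDL's residual learning and neighborhood-structure components refine this baseline; their exact formulation is given in the paper itself, not reproduced here.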
Pages: 13