On neural-network implementations of k-nearest neighbor pattern classifiers

Cited: 0
Authors
Chen, YQ [1]
Damper, RI [1]
Nixon, MS [1]
Affiliations
[1] University of Southampton, Department of Electronics and Computer Science, Southampton SO17 1BJ, Hants, England
Keywords
Logic circuits; neural-network applications; neural-network hardware; pattern recognition
DOI
Not available
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Subject Classification Codes
0808; 0809
Abstract
The k-nearest neighbor (k-NN) decision rule is the basis of a well-established, high-performance pattern-recognition technique, but its sequential implementation is inherently slow. More recently, feedforward neural networks trained by error backpropagation have been widely used to solve a variety of pattern-recognition problems. However, it is arguably unnecessary to learn such a computationally intensive solution when one (i.e., the k-NN rule) is effectively available a priori, especially given the well-known pitfalls of backpropagation. Accordingly, there is some interest in the literature in network implementations of this rule, so as to combine its known, good performance with the speed of a massively parallel realization. In this paper, we present a novel neural-network architecture which implements the k-NN rule and whose distinctive feature relative to earlier work is its synchronous (i.e., clocked) nature. Essentially, it has a layered, feedforward structure but, in its basic form, also incorporates feedback to control sequential selection of the k neighbors. The principal advantages of this new scheme are the avoidance of the stability problems which can arise with alternative asynchronous feedback (lateral-inhibition) circuits, the restriction of analog weights to the first hidden layer, and the fact that network design uses noniterative weight calculations rather than iterative backpropagation. Analysis of the network shows that it will converge to the desired solution (faithfully classifying the input pattern according to the k-NN rule) within (2k - 1) clock cycles. Apart from minor changes which can be effected externally, the same design serves for any value of k. The space complexity of the 'brute-force' network implementation is O(N²) units, where N is the number of training patterns, and it has O(N²d) analog weights, where d is the dimensionality of these patterns. Thus, some modifications to reduce the required number of units (and, thereby, weighted connections) are considered. Overall, this paper offers a route to high-speed, parallel implementations of proven pattern-classification techniques.
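For reference, a minimal sketch of the sequential k-NN decision rule that the proposed network is designed to realize in parallel is given below. This is illustrative Python, not the paper's hardware or network design; names such as knn_classify, train_X, and train_y are assumptions introduced only for the example.

```python
import numpy as np

def knn_classify(x, train_X, train_y, k):
    """Classify pattern x by the k-NN rule: take a majority vote
    among the labels of the k training patterns nearest to x."""
    # Squared Euclidean distances to all N stored training patterns: O(N*d) work
    dists = np.sum((train_X - x) ** 2, axis=1)
    # Indices of the k nearest neighbors
    nearest = np.argsort(dists)[:k]
    # Majority vote among their class labels
    labels, counts = np.unique(train_y[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Example: four stored patterns (N = 4, d = 2), two classes
train_X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
train_y = np.array([0, 0, 1, 1])
print(knn_classify(np.array([0.95, 0.9]), train_X, train_y, k=3))  # -> 1
```

Performed sequentially in this way, the distance computation and neighbor selection dominate the running time; the architecture described in the abstract trades that time cost for O(N²) units and O(N²d) analog weights, so that the same decision is reached within (2k - 1) clock cycles.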
Pages: 622-629
Number of pages: 8
Related Papers
50 records in total
  • [21] Fuzzy Monotonic K-Nearest Neighbor Versus Monotonic Fuzzy K-Nearest Neighbor
    Zhu, Hong
    Wang, Xizhao
    Wang, Ran
    IEEE TRANSACTIONS ON FUZZY SYSTEMS, 2022, 30 (09) : 3501 - 3513
  • [22] Comparison of two classifiers; K-nearest neighbor and artificial neural network, for fault diagnosis on a main engine journal-bearing
    Moosavian, A.
    Ahmadi, H.
    Tabatabaeefar, A.
    Khazaee, M.
    SHOCK AND VIBRATION, 2013, 20 (02) : 263 - 272
  • [23] RNA secondary structure prediction from sequence alignments using a network of k-nearest neighbor classifiers
    Bindewald, E
    Shapiro, BA
    RNA, 2006, 12 (03) : 342 - 352
  • [24] Comparative Analysis of K-Nearest Neighbor and Modified K-Nearest Neighbor Algorithm for Data Classification
    Okfalisa
    Mustakim
    Gazalba, Ikbal
    Reza, Nurul Gayatri Indah
    2017 2ND INTERNATIONAL CONFERENCES ON INFORMATION TECHNOLOGY, INFORMATION SYSTEMS AND ELECTRICAL ENGINEERING (ICITISEE): OPPORTUNITIES AND CHALLENGES ON BIG DATA FUTURE INNOVATION, 2017, : 294 - 298
  • [25] Visual classification of wood knots using k-nearest neighbor and convolutional neural network
    Kim H.
    Kim M.
    Park Y.
    Yang S.-Y.
    Chung H.
    Kwon O.
    Yeo H.
    Journal of the Korean Wood Science and Technology, 2019, 47 (02): 229 - 238
  • [26] Prediction of heart conditions by consensus K-nearest neighbor algorithm and convolution neural network
    Waris, Saiyed Faiayaz
    Koteeswaran, S.
    INTERNATIONAL JOURNAL OF MODELING SIMULATION AND SCIENTIFIC COMPUTING, 2022, 13 (04)
  • [27] Three different classifiers for facial age estimation based on K-nearest neighbor
    Tharwat, Alaa
    Ghanem, Ahmed M.
    Hassanien, Aboul Ella
    2013 9TH INTERNATIONAL COMPUTER ENGINEERING CONFERENCE (ICENCO 2013): TODAY INFORMATION SOCIETY WHAT'S NEXT?, 2014, : 55 - 60
  • [28] Combining multiple k-nearest neighbor classifiers using different distance functions
    Bao, YG
    Ishii, N
    Du, XY
    INTELLIGENT DATA ENGINEERING AND AUTOMATED LEARNING IDEAL 2004, PROCEEDINGS, 2004, 3177 : 634 - 641
  • [29] A new edited k-nearest neighbor rule in the pattern classification problem
    Hattori, K
    Takahashi, M
    PATTERN RECOGNITION, 2000, 33 (03) : 521 - 528
  • [30] k-Nearest Neighbour Classifiers - A Tutorial
    Cunningham, Padraig
    Delany, Sarah Jane
    ACM COMPUTING SURVEYS, 2021, 54 (06)