Incremental clustering algorithm of neural network

Cited by: 1
Authors
Liu P. [1 ,2 ]
Tang J. [1 ]
Xie S. [1 ]
Wang T. [1 ]
Affiliations
[1] College of Computer, National University of Defense Technology, Changsha
[2] Teaching and Research Section of Information Resource Management, Department of Information Construction, Academy of National Defense Information, Wuhan
Source
Guofang Keji Daxue Xuebao (Journal of National University of Defense Technology) | Issue 5 | pp. 137-142
Keywords
Clustering algorithm; Incremental learning; Neural network; Time expense;
DOI
10.11887/j.cn.201605021
Abstract
Neural network models are powerful tools for problem modelling, but the traditional back-propagation algorithm can only perform batch supervised learning, and its time expense is high. To address these problems, a novel incremental neural network model and a corresponding clustering algorithm were proposed. The model is supported by biological evidence and is built on a novel neuron activation function and synapse adjusting function. On this basis, an adaptive incremental clustering algorithm was proposed, in which mechanisms such as "winner-take-all" were introduced, so that the "catastrophic forgetting" problem is avoided during incremental clustering. Experimental results on classic datasets show that the algorithm's clustering performance is comparable to that of traditional models such as K-means, while its time and space expenses on incremental tasks are much lower. © 2016, NUDT Press. All rights reserved.
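As a rough illustration of the kind of incremental, winner-take-all clustering the abstract describes, the sketch below processes one sample at a time: each sample either updates its nearest prototype or opens a new cluster, so earlier prototypes are never retrained in batch. This is not the authors' model; the class name IncrementalWTAClusterer, the vigilance distance threshold, and the 1/n learning-rate schedule are illustrative assumptions.

```python
# Minimal sketch of online winner-take-all clustering (assumed design, not the paper's code).
import numpy as np

class IncrementalWTAClusterer:
    """Each incoming sample updates only the winning prototype or spawns a
    new cluster, so previously learned prototypes are never overwritten
    wholesale (one simple way to sidestep catastrophic forgetting)."""

    def __init__(self, vigilance=1.0):
        self.vigilance = vigilance  # distance threshold for opening a new cluster (assumed parameter)
        self.prototypes = []        # one prototype vector per cluster
        self.counts = []            # number of samples absorbed by each cluster

    def partial_fit(self, x):
        """Assign one sample incrementally; return its cluster index."""
        x = np.asarray(x, dtype=float)
        if not self.prototypes:
            self.prototypes.append(x.copy())
            self.counts.append(1)
            return 0
        dists = [np.linalg.norm(x - p) for p in self.prototypes]
        winner = int(np.argmin(dists))
        if dists[winner] > self.vigilance:
            # No existing prototype is close enough: create a new cluster.
            self.prototypes.append(x.copy())
            self.counts.append(1)
            return len(self.prototypes) - 1
        # Winner-take-all update: only the winning prototype moves, with a
        # step size that shrinks as the cluster absorbs more samples.
        self.counts[winner] += 1
        lr = 1.0 / self.counts[winner]
        self.prototypes[winner] += lr * (x - self.prototypes[winner])
        return winner

# Streaming usage: samples arrive one by one, with no batch retraining.
rng = np.random.default_rng(0)
stream = np.vstack([rng.normal(0, 0.2, (50, 2)), rng.normal(3, 0.2, (50, 2))])
clu = IncrementalWTAClusterer(vigilance=1.0)
labels = [clu.partial_fit(x) for x in stream]
print(len(clu.prototypes))  # typically 2 clusters for this two-blob stream
```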
Pages: 137-142
Page count: 5
Related papers
17 in total
  • [1] Xie S.X., Wang T., Construction of unsupervised sentiment classifier on idioms resources, Journal of Central South University, 21, 4, pp. 1376-1384, (2014)
  • [2] Wang X., Jia Y., Chen R., et al., Improving text categorization with semantic knowledge in Wikipedia, IEICE Transactions on Information and Systems, 96, 12, pp. 2786-2794, (2013)
  • [3] Zhang Z., Liao Y., Yu Y., Application of BP neural network model in prediction of polar motion, Journal of National University of Defense Technology, 37, 2, pp. 156-160, (2015)
  • [4] Grossberg S., Adaptive resonance theory: how a brain learns to consciously attend, learn, and recognize a changing world, Neural Networks, 37, pp. 1-47, (2012)
  • [5] Rumelhart D.E., McClelland J.L., Parallel Distributed Processing: Explorations in the Microstructure of Cognition, (1988)
  • [6] Hinton G.E., Osindero S., Teh Y.W., A fast learning algorithm for deep belief nets, Neural Computation, 18, pp. 1527-1554, (2006)
  • [7] Warren S.S., Why statisticians should not FART
  • [8] Liu P.L., Tang J.T., Wang T., Information current in Twitter: which brings hot events to the world, Proceedings of the 22nd International Conference on World Wide Web, pp. 111-112, (2013)
  • [9] Hodgkin A.L., Huxley A.F., A quantitative description of membrane current and its application to conduction and excitation in nerve, Bulletin of Mathematical Biology, 52, 1-2, pp. 25-71, (1990)
  • [10] Liu K., Hou Z., The periodic solution of a neural networks of two neurons with McCulloch-Pitts nonlinearity, Journal of National University of Defense Technology, 30, 4, pp. 129-132, (2008)