Long Term Recurrent Neural Network with State-Frequency Memory

Cited by: 0
Authors
Zhuang L. [1 ]
Lü Y. [1 ]
Yang J. [2 ,3 ]
Li H. [1 ]
Affiliations
[1] School of Information Science and Technology, University of Science and Technology of China, Hefei
[2] Institute of System Engineering, Academy of Military Science, Beijing
[3] Peng Cheng Laboratory, Shenzhen, 518000, Guangdong
Funding
National Natural Science Foundation of China
Keywords
Frequency domain analysis; Long-term dependency; Recurrent neural network (RNN); Time series classification; Time series prediction
DOI
10.7544/issn1000-1239.2019.20180474
Abstract
Modeling time series has become a research hotspot in machine learning because of its important application value. The recurrent neural network (RNN) has been a crucial tool for modeling time series in recent years. However, existing RNNs commonly struggle to learn long-term dependencies in the temporal domain and cannot model the frequency patterns in time series. These two problems severely limit the performance of existing RNNs on time series that contain long-term dependencies and rich frequency components. To solve them, we propose the long term recurrent neural network with state-frequency memory (LTRNN-SFM), which models features in both the frequency and temporal domains by replacing the hidden-layer state vector of conventional RNNs with a state-frequency matrix. Meanwhile, the proposed network effectively avoids the vanishing and exploding gradient problems by decoupling neurons within the same layer, using activation functions such as the rectified linear unit (ReLU), and clipping the weights. In this way, an LTRNN-SFM with long-term memory and multiple layers can be trained easily. Experimental results demonstrate that the proposed network achieves the best performance in processing time series with long-term dependencies and rich frequency components. © 2019, Science Press. All rights reserved.
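The abstract describes the mechanism only at a high level. Below is a minimal sketch of a state-frequency memory cell, assuming the gating form and Fourier basis of the state-frequency memory RNN of Hu and Qi (ICML 2017) combined with IndRNN-style element-wise recurrence, ReLU activations, and weight clipping as the abstract indicates; the class and parameter names (SFMCellSketch, num_freqs, clip) are illustrative, not the authors' implementation.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SFMCellSketch:
    """State-frequency memory cell sketch: the hidden state is a D x K
    matrix (D state dimensions, K frequency components), not a vector."""

    def __init__(self, input_dim, state_dim, num_freqs, clip=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.D, self.K = state_dim, num_freqs
        self.clip = clip  # bound for recurrent-weight clipping
        # Input-to-hidden weights for candidate, input gate, forget gate.
        self.W = rng.normal(0.0, 0.1, (3, state_dim, input_dim))
        # Element-wise recurrent weights: neurons in the same layer are
        # decoupled (IndRNN-style), so gradients do not mix across neurons.
        self.u = rng.uniform(-0.5, 0.5, (3, state_dim))
        self.b = np.zeros((3, state_dim))
        # Fixed grid of angular frequencies for the Fourier basis.
        self.omega = 2.0 * np.pi * np.arange(1, num_freqs + 1) / num_freqs

    def clip_weights(self):
        # Called by the trainer after each gradient step.
        np.clip(self.u, -self.clip, self.clip, out=self.u)

    def step(self, x, S_re, S_im, h, t):
        """One time step. S_re/S_im hold the real/imaginary parts of the
        D x K state-frequency matrix; h is the D-dim recurrent summary."""
        pre = self.W @ x + self.u * h + self.b        # (3, D) pre-activations
        c = relu(pre[0])                              # candidate state (ReLU)
        i, f = sigmoid(pre[1]), sigmoid(pre[2])       # input / forget gates
        # Outer product with the Fourier basis spreads the gated update
        # across the K frequency components of every state dimension.
        inc = (i * c)[:, None]                        # (D, 1)
        S_re = f[:, None] * S_re + inc * np.cos(self.omega * t)
        S_im = f[:, None] * S_im + inc * np.sin(self.omega * t)
        # Pool the frequency amplitudes back into a D-dim hidden summary.
        amp = np.sqrt(S_re ** 2 + S_im ** 2)          # (D, K) amplitudes
        h = relu(amp.mean(axis=1))
        return S_re, S_im, h

# Usage: run the cell over a random 100-step sequence.
cell = SFMCellSketch(input_dim=8, state_dim=16, num_freqs=4)
S_re, S_im, h = np.zeros((16, 4)), np.zeros((16, 4)), np.zeros(16)
for t, x in enumerate(np.random.default_rng(1).normal(size=(100, 8)), start=1):
    S_re, S_im, h = cell.step(x, S_re, S_im, h, t)
```

Because each neuron's recurrence depends only on its own previous summary (self.u is element-wise rather than a full recurrent matrix), and the recurrent weights are clipped to a bounded range, repeated multiplication through time stays controlled, which is the decoupling the abstract credits for avoiding vanishing and exploding gradients.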
Pages: 2641-2648 (7 pages)