Recursive PCA and the structure of time series.

Cited: 0
Author
Voegtlin, T [1]
Affiliation
[1] Humboldt Univ, Inst Theoret Biol, Berlin, Germany
Keywords
DOI
None available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A recurrent linear network can be trained with Oja's constrained Hebbian learning rule. As a result, the network learns to represent the temporal context associated with its input sequence. The operation performed by the network is a generalization of Principal Components Analysis (PCA) to time series, called Recursive PCA. During learning, the weights of the network are adapted to the temporal statistics of its input in a way that maximizes the information retained by the network. Sequences stored in the network may be retrieved in the reverse order of presentation, thus providing a straightforward implementation of a logical stack.
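The abstract describes training a recurrent linear network with Oja's constrained Hebbian rule on the concatenation of the current input and the previous context. A minimal NumPy sketch of that idea follows; the dimensions, the context-scaling factor `alpha`, the learning rate, and the toy Gaussian input are all illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_ctx = 3, 2   # input and context dimensions (illustrative choices)
alpha = 0.5          # context scaling factor (assumed; keeps the state bounded)
lr = 0.005           # learning rate (assumed)

# One weight matrix maps the concatenation [input; scaled previous context]
# to the new context representation.
W = 0.1 * rng.standard_normal((n_ctx, n_in + n_ctx))

def forward(x, y_prev, W):
    """One step of the recurrent linear network."""
    z = np.concatenate([x, alpha * y_prev])  # input plus scaled previous context
    return W @ z, z

def oja_subspace(W, y, z, lr):
    """Oja's constrained Hebbian (subspace) rule: dW = lr * (y z^T - y y^T W)."""
    return W + lr * (np.outer(y, z) - np.outer(y, y) @ W)

# Train on a toy i.i.d. Gaussian sequence (a stand-in for a real time series).
y = np.zeros(n_ctx)
for t in range(5000):
    x = rng.standard_normal(n_in)
    y, z = forward(x, y, W)
    W = oja_subspace(W, y, z, lr)

# The constraint term -y y^T W keeps the weights bounded while the Hebbian
# term drives them toward the principal subspace of the input/context signal.
```

In the paper's formulation the details of the scaling and update differ; this sketch only shows the shared structure: a Hebbian PCA rule applied recursively to the joint input-and-context vector.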
Pages: 1893-1897
Page count: 5