A recurrent linear network can be trained with Oja's constrained Hebbian learning rule. As a result, the network learns to represent the temporal context associated with its input sequence. The operation performed by the network is a generalization of Principal Components Analysis (PCA) to time series, called Recursive PCA. During learning, the weights of the network adapt to the temporal statistics of its input in a way that maximizes the information retained by the network. Sequences stored in the network may be retrieved in the reverse order of presentation, thus providing a straightforward implementation of a logical stack.
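To make the mechanism concrete, here is a minimal NumPy sketch, not the paper's implementation: at each time step, Oja's constrained Hebbian subspace rule is applied to the concatenation of the current input and the (scaled) previous code vector, which is the sense in which PCA becomes recursive. The function name `recursive_pca_train` and the parameters `alpha` (context weighting), `eta` (learning rate), and `n_epochs` are illustrative assumptions, not notation from the source.

```python
import numpy as np

def recursive_pca_train(X, n_components, alpha=0.5, eta=0.01, n_epochs=10, seed=0):
    """Sketch of Recursive PCA (assumed formulation).

    Oja's rule is applied to z_t = [x_t, alpha * y_{t-1}], the joint
    vector of current input and previous code, so the learned subspace
    captures temporal context as well as instantaneous structure.

    X: array of shape (T, d), one input vector per time step.
    """
    rng = np.random.default_rng(seed)
    T, d = X.shape
    k = n_components
    # Weight matrix maps the concatenated (d + k)-vector to the k-dim code.
    W = rng.normal(scale=0.1, size=(k, d + k))
    for _ in range(n_epochs):
        y = np.zeros(k)                            # reset context each pass
        for x in X:
            z = np.concatenate([x, alpha * y])     # input + recurrent context
            y = W @ z                              # linear code (network state)
            # Oja's constrained Hebbian update: Hebbian term minus a decay
            # term that keeps the rows of W (approximately) orthonormal.
            W += eta * (np.outer(y, z) - np.outer(y, y @ W))
    return W
```

After training, running the sequence forward produces codes `y_t` that summarize a fading history of the input; the stack-like retrieval mentioned above would correspond to (approximately) inverting `W` to recover `x_t` and the previous code `y_{t-1}` from `y_t`, popping the sequence in reverse order.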