We outline a model of computing with high-dimensional (HD) vectors, where the dimensionality is in the thousands. It builds on ideas from traditional (symbolic) computing and from artificial neural nets/deep learning, and complements them with ideas from probability theory, statistics, and abstract algebra. Key properties of HD computing include a well-defined set of arithmetic operations on vectors, generality, scalability, robustness, fast learning, and ubiquitous parallel operation, which together make it possible to develop efficient algorithms for large-scale real-world tasks. We present a 2-D architecture and demonstrate its functionality with examples from text analysis, pattern recognition, and biosignal processing, achieving classification accuracy close to or above that of conventional machine-learning methods, together with high energy efficiency and robustness, using simple algorithms that learn fast. HD computing is ideally suited for 3-D nanometer circuit technology, vastly increasing circuit density and energy efficiency, and paving the way to systems capable of advanced cognitive tasks.
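To make the "well-defined set of arithmetic operations on vectors" concrete, the following is a minimal sketch (not the paper's specific architecture) of three operations commonly used in HD computing with dense binary hypervectors: binding via elementwise XOR, bundling via elementwise majority, and similarity via normalized Hamming distance. The dimensionality `D = 10000`, the function names, and the key-value record example are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # dimensionality in the thousands, as in the text

def rand_hv():
    """Random dense binary hypervector; any two are nearly orthogonal
    (normalized Hamming distance close to 0.5)."""
    return rng.integers(0, 2, D, dtype=np.uint8)

def bind(a, b):
    """Binding via elementwise XOR: invertible, since bind(bind(a, b), b) == a."""
    return a ^ b

def bundle(*vs):
    """Bundling via elementwise majority vote (use an odd count to avoid
    ties); the result remains similar to each of its inputs."""
    s = np.sum(vs, axis=0)
    return (2 * s > len(vs)).astype(np.uint8)

def hamming(a, b):
    """Normalized Hamming distance: 0 for identical vectors, about 0.5
    for unrelated random vectors."""
    return float(np.mean(a != b))

# Illustrative use: encode a record of three key-value pairs as a single
# hypervector, then query one field by unbinding with its key.
x, y, z = rand_hv(), rand_hv(), rand_hv()      # keys
va, vb, vc = rand_hv(), rand_hv(), rand_hv()   # values
record = bundle(bind(x, va), bind(y, vb), bind(z, vc))
# bind(record, x) is noticeably closer to va than to any unrelated vector.
```

Because unrelated random hypervectors sit near Hamming distance 0.5 while the noisy query result stays well below it, a nearest-neighbor lookup over stored item vectors recovers the intended value; this tolerance to componentwise noise underlies the robustness claims above.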