
Thursday, December 17, 2015

Continuous online sequence learning with an unsupervised neural network model. (arXiv:1512.05463v1 [cs.NE])

The ability to recognize and predict temporal sequences of sensory inputs is vital for survival in natural environments. Based on many known properties of cortical neurons, a recent study proposed hierarchical temporal memory (HTM) sequence memory as a theoretical framework for sequence learning in the cortex. In this paper, we analyze properties of HTM sequence memory and apply it to various sequence learning and prediction problems. We show that the model is able to continuously learn a large number of variable-order temporal sequences using an unsupervised Hebbian-like learning rule. The sparse temporal codes formed by the model can robustly handle branching temporal sequences by maintaining multiple predictions until there is sufficient disambiguating evidence. We compare HTM sequence memory with other sequence learning algorithms, including the autoregressive integrated moving average (ARIMA) model and long short-term memory (LSTM), on sequence prediction problems with both artificial and real-world data. The HTM model not only achieves comparable or better accuracy than state-of-the-art algorithms, but also exhibits a set of properties that are critical for sequence learning. These properties include continuous online learning, the ability to handle multiple predictions and branching sequences, robustness to sensor noise and fault tolerance, and good performance without task-specific hyper-parameter tuning. Therefore, the HTM sequence memory not only advances our understanding of how the brain may solve the sequence learning problem, but is also applicable to a wide range of real-world problems such as discrete and continuous sequence prediction, anomaly detection, and sequence classification.
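To make the abstract's central idea concrete, here is a minimal sketch of online, variable-order sequence prediction that keeps multiple simultaneous predictions for branching sequences. It is not the paper's HTM implementation (no sparse distributed representations, columns, or cells); it is a plain dictionary-based predictor, and all names in it (VariableOrderPredictor, max_order, the example stream) are illustrative assumptions, not from the paper.

```python
# Minimal sketch (NOT the paper's HTM model) of the properties the abstract
# describes: continuous online learning of variable-order transitions, and
# multiple candidate predictions maintained for branching sequences until the
# context disambiguates them. All identifiers here are illustrative.
from collections import defaultdict

class VariableOrderPredictor:
    def __init__(self, max_order=3):
        self.max_order = max_order
        # Maps a context tuple (the last k symbols) to counts of the next symbol.
        self.transitions = defaultdict(lambda: defaultdict(int))
        self.history = []

    def predict(self):
        """Return the set of plausible next symbols, preferring longer contexts."""
        for k in range(min(self.max_order, len(self.history)), 0, -1):
            context = tuple(self.history[-k:])
            counts = self.transitions.get(context)
            if counts:
                best = max(counts.values())
                # Keep every symbol that ties the maximum count: for branching
                # sequences this yields multiple simultaneous predictions.
                return {s for s, c in counts.items() if c == best}
        return set()

    def learn(self, symbol):
        """Online, Hebbian-like update: strengthen context -> symbol transitions."""
        for k in range(1, min(self.max_order, len(self.history)) + 1):
            context = tuple(self.history[-k:])
            self.transitions[context][symbol] += 1
        self.history.append(symbol)

# Streaming evaluation: predict first, then learn, one element at a time.
predictor = VariableOrderPredictor()
stream = list("ABCDXABCDY" * 5)   # two branches after "ABCD": X or Y
correct = 0
for symbol in stream:
    if symbol in predictor.predict():
        correct += 1
    predictor.learn(symbol)
print(f"online accuracy: {correct / len(stream):.2f}")
```

The predict-then-learn loop mirrors the continuous online setting the abstract emphasizes: there is no separate training phase, and accuracy is measured on the stream as the model learns.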




from cs.AI updates on arXiv.org http://ift.tt/1T4UGVT
via IFTTT
