Thursday, September 8, 2016

Backpropagation of Hebbian plasticity for lifelong learning. (arXiv:1609.02228v1 [cs.NE])

Hebbian plasticity allows biological agents to learn from their lifetime experience, extending the fixed information provided by evolutionary search. By contrast, backpropagation methods can build high-performance fixed-weight networks, but they are not currently equipped to design networks with Hebbian connections. Here we use backpropagation to train fully differentiable plastic networks, such that backpropagation determines not only the baseline weights, but also the plasticity of each connection. To perform this backpropagation of Hebbian plasticity (BOHP), we derive error gradients for neural networks with Hebbian plastic connections. The equations for these gradients turn out to follow a simple, recursive form. We apply this method to train small networks on simple learning tasks inspired by classical conditioning. We show that, through Hebbian plasticity, the networks quickly learn unpredictable environmental features during their lifetime, successfully solving a task that fixed-weight feedforward networks cannot possibly solve. We conclude that backpropagation of Hebbian plasticity offers a powerful model for lifelong learning.
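The mechanism the abstract describes lends itself to a compact sketch: each connection carries a fixed baseline weight and a plasticity coefficient that gates a running Hebbian trace, and backpropagation through the unrolled "lifetime" tunes both. The code below is a minimal illustration, not the paper's implementation: the PyTorch module, the decaying-trace update, the tanh nonlinearity, the toy target, and all hyperparameters are assumptions made for the example.

```python
import torch

class PlasticLayer(torch.nn.Module):
    def __init__(self, n_in, n_out, eta=0.1):
        super().__init__()
        self.w = torch.nn.Parameter(0.1 * torch.randn(n_in, n_out))      # baseline weights
        self.alpha = torch.nn.Parameter(0.1 * torch.randn(n_in, n_out))  # per-connection plasticity
        self.eta = eta                                                    # Hebbian trace rate (fixed here)

    def forward(self, x, hebb):
        # Effective strength of each connection = baseline weight + plasticity-gated Hebbian trace.
        y = torch.tanh(torch.bmm(x.unsqueeze(1), self.w + self.alpha * hebb).squeeze(1))
        # Decaying Hebbian trace: accumulates recent input/output correlations for each lifetime.
        hebb = (1 - self.eta) * hebb + self.eta * torch.bmm(x.unsqueeze(2), y.unsqueeze(1))
        return y, hebb

# Unroll one "lifetime" (episode) and backpropagate through every step, so the
# gradient shapes both the baseline weights and the plasticity coefficients.
layer = PlasticLayer(n_in=4, n_out=2)
opt = torch.optim.Adam(layer.parameters(), lr=1e-2)
for episode in range(100):
    hebb = torch.zeros(8, 4, 2)            # one Hebbian trace per lifetime in the batch
    xs = torch.randn(8, 20, 4)             # 20 steps of made-up stimuli (placeholder task)
    loss = 0.0
    for t in range(xs.shape[1]):
        y, hebb = layer(xs[:, t], hebb)
        loss = loss + (y - xs[:, t, :2]).pow(2).mean()   # placeholder target, not the paper's task
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because the Hebbian trace is built from differentiable operations, the single `loss.backward()` call yields gradients for both `w` and `alpha`, which is the core idea behind backpropagation of Hebbian plasticity.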



from cs.AI updates on arXiv.org http://ift.tt/2cxDnyC
via IFTTT