Sunday, December 18, 2016

Event-driven Random Back-Propagation: Enabling Neuromorphic Deep Learning Machines. (arXiv:1612.05596v1 [cs.NE])

An ongoing challenge in neuromorphic computing is to devise general, computationally efficient models of inference and learning that are compatible with the spatial and temporal constraints of the brain. Gradient-descent backpropagation is powerful and ubiquitous in deep learning, but it relies on the immediate availability of network-wide information stored in high-precision memory. Recent work, however, shows that the exact, symmetric feedback weights used by backpropagation are not essential for learning deep representations: random backpropagation replaces the feedback weights with fixed random ones, and the network adjusts its feed-forward weights to learn pseudo-inverses of those random feedback weights. Here, we demonstrate an event-driven random backpropagation (eRBP) rule that uses error-modulated synaptic plasticity to learn deep representations in neuromorphic hardware. The rule is well suited to such hardware, requiring only a two-compartment leaky integrate-and-fire neuron and a membrane-voltage-modulated, spike-driven plasticity rule. Our results show that eRBP learns deep representations rapidly without backpropagated gradients, achieving classification accuracies nearly identical to those of artificial neural network simulations on GPUs, while remaining robust to quantization of neural and synaptic state during learning.
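Although the paper's contribution is a spiking, event-driven realization of this idea in neuromorphic hardware, the underlying random-feedback mechanism is easy to illustrate in a conventional rate-based network. The NumPy sketch below shows feedback alignment for a single hidden layer: the backward pass routes the output error through a fixed random matrix B instead of the transpose of the forward weights. The layer sizes, sigmoid nonlinearity, squared-error loss, and learning rate are illustrative assumptions, not the paper's spiking model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Layer sizes (illustrative: an MNIST-like input and 10 classes).
n_in, n_hid, n_out = 784, 100, 10

W1 = rng.normal(0.0, 0.1, (n_hid, n_in))   # input -> hidden (learned)
W2 = rng.normal(0.0, 0.1, (n_out, n_hid))  # hidden -> output (learned)
B = rng.normal(0.0, 0.1, (n_hid, n_out))   # fixed random feedback weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_step(x, target, lr=0.01):
    """One SGD step; the hidden error uses B in place of W2.T."""
    global W1, W2
    # Forward pass.
    h = sigmoid(W1 @ x)
    y = sigmoid(W2 @ h)
    # Output error (gradient of squared-error loss at the output).
    e = y - target
    # Random backpropagation: exact backprop would use W2.T @ e here.
    dh = (B @ e) * h * (1.0 - h)
    # Standard outer-product weight updates.
    W2 -= lr * np.outer(e, h)
    W1 -= lr * np.outer(dh, x)
    return float(np.mean(e ** 2))

# Toy usage: drive the loss down on a single random input/target pair.
x = rng.random(n_in)
target = np.eye(n_out)[3]  # one-hot target for class 3
for step in range(200):
    loss = train_step(x, target)
print(f"final squared error: {loss:.4f}")
```

Because B never changes, any reduction in error must come from the forward weights adapting to make B a useful feedback path, which is the sense in which the network learns pseudo-inverses of the random feedback weights.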



from cs.AI updates on arXiv.org http://ift.tt/2hydGme
