Thursday, October 20, 2016

A Growing Long-term Episodic & Semantic Memory. (arXiv:1610.06402v1 [cs.AI])

The long-term memory of most connectionist systems lies entirely in the weights of the system. Since the number of weights is typically fixed, this bounds the total amount of knowledge that can be learned and stored. Though this is not normally a problem for a neural network designed for a specific task, such a bound is undesirable for a system that continually learns over an open range of domains. To address this, we describe a lifelong learning system that leverages a fast, though non-differentiable, content-addressable memory which can be exploited to encode both a long history of sequential episodic knowledge and semantic knowledge over many episodes for an unbounded number of domains. This opens the door to investigating transfer learning, and to applying prior knowledge learned over a lifetime of experiences to new domains.
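
To make the core idea concrete, here is a minimal sketch of a growing content-addressable memory: entries are retrieved by similarity to a query rather than by address, and storage expands as new episodes are written instead of being bounded by a fixed weight matrix. This is only an illustration of the general mechanism the abstract describes; the class name, the cosine-similarity lookup, and the payload format are my own assumptions, not the authors' implementation.

import numpy as np

class ContentAddressableMemory:
    """Toy key-value memory: values are recalled by content similarity,
    and capacity grows as entries are appended."""

    def __init__(self, key_dim):
        self.key_dim = key_dim
        self.keys = []    # content keys (1-D vectors)
        self.values = []  # stored payloads (episodic or semantic records)

    def write(self, key, value):
        # Appending rather than overwriting means storage is unbounded,
        # unlike knowledge held only in a fixed set of weights.
        self.keys.append(np.asarray(key, dtype=float))
        self.values.append(value)

    def read(self, query, k=1):
        # Content-based lookup: return the k values whose keys are most
        # similar (cosine similarity) to the query vector.
        if not self.keys:
            return []
        keys = np.stack(self.keys)
        q = np.asarray(query, dtype=float)
        sims = keys @ q / (np.linalg.norm(keys, axis=1) * np.linalg.norm(q) + 1e-8)
        top = np.argsort(-sims)[:k]
        return [self.values[i] for i in top]

# Example usage (hypothetical feature vectors and domains):
mem = ContentAddressableMemory(key_dim=4)
mem.write([0.9, 0.1, 0.0, 0.2], {"episode": "saw a red door", "domain": "navigation"})
mem.write([0.0, 0.8, 0.7, 0.1], {"episode": "heard a bell", "domain": "audio"})
print(mem.read([1.0, 0.0, 0.0, 0.1], k=1))  # recalls the "red door" episode

Because lookup is a discrete nearest-neighbour search rather than a differentiable attention over memory slots, this kind of store is fast and unbounded but cannot be trained end-to-end by backpropagation, which matches the "fast, though non-differentiable" trade-off noted in the abstract.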



from cs.AI updates on arXiv.org http://ift.tt/2eo3Ocf
via IFTTT