A Growing Long-term Episodic & Semantic Memory

Chris Tar
Marc Pickett
Rami Eid
Yuanlong Shao
NIPS Workshop on Continual Learning and Deep Networks (2016)

Abstract

The long-term memory of most connectionist systems lies entirely in the weights
of the system. Since the number of weights is typically fixed, this bounds the
total amount of knowledge that can be learned and stored. Though this is not
normally a problem for a neural network designed for a specific task, such a bound
is undesirable for a system that continually learns over an open range of domains.
To address this, we describe a lifelong learning system that leverages a fast,
though non-differentiable, content-addressable memory, which can encode both a
long history of sequential episodic knowledge and semantic knowledge over many
episodes, for an unbounded number of domains. This opens the door to
investigating transfer learning, and to applying prior knowledge, learned over
a lifetime of experiences, to new domains.
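
As a concrete illustration of the kind of mechanism the abstract describes, the
following is a minimal Python sketch of a growing, non-differentiable
content-addressable memory: items are stored as (key, value) pairs and
retrieved by nearest-neighbor lookup on the keys. The class name, the cosine
similarity retrieval, and all identifiers are illustrative assumptions, not the
paper's implementation.

import numpy as np

class ContentAddressableMemory:
    """Hypothetical sketch: an append-only key-value store with
    nearest-neighbor (cosine similarity) retrieval. The store grows
    with experience rather than being bounded by a fixed weight count."""

    def __init__(self, key_dim):
        self.keys = np.empty((0, key_dim))  # grows without a fixed bound
        self.values = []

    def write(self, key, value):
        # Append a new (key, value) pair; memory size is unbounded.
        self.keys = np.vstack([self.keys,
                               np.asarray(key, dtype=float).reshape(1, -1)])
        self.values.append(value)

    def read(self, query, k=1):
        # Return the k stored values whose keys are most similar
        # (by cosine similarity) to the query vector.
        q = np.asarray(query, dtype=float)
        norms = np.linalg.norm(self.keys, axis=1) * np.linalg.norm(q)
        sims = (self.keys @ q) / np.maximum(norms, 1e-8)
        top = np.argsort(-sims)[:k]
        return [self.values[i] for i in top]

# Usage: store episodic observations keyed by an encoding of the state,
# then retrieve by content rather than by address.
mem = ContentAddressableMemory(key_dim=4)
mem.write([1.0, 0.0, 0.0, 0.0], "episode-1 observation")
mem.write([0.0, 1.0, 0.0, 0.0], "episode-2 observation")
print(mem.read([0.9, 0.1, 0.0, 0.0]))  # -> ['episode-1 observation']

Because lookup is a discrete nearest-neighbor operation, such a memory is fast
and unbounded but non-differentiable, which is the trade-off the abstract
highlights relative to storing all knowledge in a fixed set of weights.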
