Google Research

Delay Learning and Polychronization for Reservoir Computing

Neurocomputing, vol. 71 (2008), pp. 1143-1158

Abstract

We propose a multi-scale learning rule for spiking neuron networks, in the vein of the recently emerging field of reservoir computing. The reservoir is a network of spiking neurons with random topology, driven by Spike-Timing-Dependent Plasticity (STDP), a biologically observed temporal Hebbian unsupervised learning rule. The model is further driven by a supervised learning algorithm, based on a margin criterion, that adapts the synaptic delays linking the network to the readout neurons, with classification as the goal task. Both the network processing and the resulting performance can be explained by the concept of polychronization, proposed by Izhikevich (2006, Neural Computation, 18(1)) on physiological grounds. The model emphasizes the computational capabilities of this concept.
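As a rough illustration of the pair-based STDP rule the abstract refers to, the sketch below computes the weight change for a single pre/post spike pair with exponential time windows. The amplitudes and time constants are illustrative assumptions, not values taken from the paper.

```python
import math

# Pair-based STDP sketch: a presynaptic spike arriving shortly before a
# postsynaptic spike potentiates the synapse; the reverse order depresses
# it. Parameter values below are assumed for illustration only.
A_PLUS, A_MINUS = 0.01, 0.012      # learning amplitudes (assumed)
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # time constants in ms (assumed)

def stdp_dw(t_pre: float, t_post: float) -> float:
    """Weight change for one pre/post spike pair (spike times in ms)."""
    dt = t_post - t_pre
    if dt > 0:    # pre before post -> potentiation
        return A_PLUS * math.exp(-dt / TAU_PLUS)
    elif dt < 0:  # post before pre -> depression
        return -A_MINUS * math.exp(dt / TAU_MINUS)
    return 0.0    # simultaneous spikes -> no change in this sketch

print(stdp_dw(10.0, 15.0))  # positive: synapse strengthened
print(stdp_dw(15.0, 10.0))  # negative: synapse weakened
```

In the paper's setting this unsupervised rule shapes the reservoir's internal weights, while the supervised margin-based algorithm adjusts only the delays to the readout neurons.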
