Enhancing recurrent neural networks with biologically inspired synaptic plasticity.
Duggento A., Toschi N.
Synaptic plasticity in brain cells is the fundamental mechanism that modulates synaptic strengths and lets us form memories, recall experiences, and learn new tasks without forgetting previous ones. Hebbian/anti-Hebbian plasticity theory postulates that the efficacy of the synaptic connection between two neurons increases/decreases with the synchronicity of pre- and postsynaptic activity. We investigate Hebbian plasticity in Echo State Networks (ESNs), a class of artificial recurrent neural networks consisting of an input layer, a hidden layer (or reservoir) and an output layer. ESNs are especially well suited to predicting and classifying temporal data while minimizing overtraining. We study how different biologically inspired plasticity rules can be applied to the connections (synapses) between reservoir neurons and demonstrate that including plasticity can improve classification performance. Further, akin to its biological counterpart, we show that synaptic plasticity helps retain high performance during sequential learning, where learning new tasks does not necessarily lead to catastrophic forgetting of previously learned tasks.
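To make the idea concrete, the following is a minimal sketch (not the authors' exact model or rule) of an Echo State Network whose reservoir weights receive a Hebbian update proportional to the product of post- and presynaptic activity; all sizes, rates, and the spectral-radius rescaling are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_res = 1, 100     # input and reservoir sizes (hypothetical)
eta = 1e-3               # Hebbian learning rate (assumed)
rho_target = 0.9         # spectral radius < 1, for the echo state property

# Random input and reservoir weight matrices.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= rho_target / max(abs(np.linalg.eigvals(W)))   # rescale spectral radius


def step(x, u, hebbian=True):
    """One reservoir update, optionally followed by a Hebbian weight change."""
    global W
    x_new = np.tanh(W_in @ u + W @ x)
    if hebbian:
        # Hebb's rule: strengthen synapses between co-active neurons
        # (outer product of postsynaptic and presynaptic activities).
        W += eta * np.outer(x_new, x)
        # Rescale so the echo state property is preserved after the update.
        W *= rho_target / max(abs(np.linalg.eigvals(W)))
    return x_new


# Drive the reservoir with a toy input signal.
x = np.zeros(n_res)
for t in range(50):
    u = np.array([np.sin(0.2 * t)])
    x = step(x, u)
```

An anti-Hebbian variant would simply flip the sign of the update (`W -= eta * np.outer(x_new, x)`); in either case the rescaling step keeps the reservoir dynamics stable while plasticity reshapes the relative synaptic strengths.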