Colloquium on December 15th, 2008

Nicolas Brunel
CNRS, Université Paris

Optimizing information storage in neural circuits: consequences for the statistics of synaptic connectivity

It is widely believed that synaptic modifications underlie learning and memory. This hypothesis has led to the study of many `learning rules' that describe, in a simplified way, how synaptic efficacy is controlled by neuronal activity. This talk will focus on a complementary research direction: asking what synaptic connectivity looks like in circuits that store information optimally.

The first part of the talk will focus on the perceptron, the simplest feed-forward network model, as a simplified model of the granule cell-Purkinje cell pathway in the cerebellum. The distribution of synaptic weights of a perceptron that optimizes its storage capacity can be computed exactly. This distribution has two striking features: (i) it contains a large fraction (at least 50%) of exactly zero weights (`silent' or `potential' synapses); (ii) the positive weights are distributed according to a monotonically decreasing function. We find that this theoretical distribution closely fits the measured distribution of synaptic weights of connections between granule cells and Purkinje cells in adult rats, suggesting that Purkinje cells operate close to their maximal storage capacity, which we estimate to be about 5 kilobytes per cell.
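
As a rough numerical illustration (not part of the talk itself), the Python sketch below trains a perceptron whose weights are constrained to be non-negative, mimicking the excitatory granule cell to Purkinje cell synapses; when loaded with many random associations, a sizeable fraction of the learned weights ends up exactly at zero. The sizes, learning rate, and clipped perceptron rule are illustrative assumptions, not the method presented in the talk.

import numpy as np

# Illustrative sketch only: a perceptron with non-negative (excitatory) weights
# trained on random classifications of random binary input patterns. The sizes,
# learning rate, and clipped perceptron rule are assumptions, not the talk's method.
rng = np.random.default_rng(0)
N, P = 200, 120                             # number of inputs and of patterns to store
X = rng.choice([0.0, 1.0], size=(P, N))     # binary input patterns (granule-cell-like)
y = rng.choice([-1.0, 1.0], size=P)         # desired +/-1 output of the Purkinje-like unit

w, b = np.zeros(N), 0.0                     # non-negative weights, unconstrained threshold
eta = 0.1
for epoch in range(2000):
    errors = 0
    for mu in range(P):
        if y[mu] * (X[mu] @ w + b) <= 0.0:  # pattern mu not yet stored
            w += eta * y[mu] * X[mu]        # perceptron update
            b += eta * y[mu]
            np.clip(w, 0.0, None, out=w)    # enforce the excitatory sign constraint
            errors += 1
    if errors == 0:
        break

# At high load, a large fraction of weights is pinned at exactly zero ("silent" synapses).
print(f"fraction of exactly zero weights: {np.mean(w == 0.0):.2f}")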

In the second part of the talk, I will consider a network with a fully connected recurrent architecture, as a simplified model of local pyramidal cell networks of the neocortex. If the network is required to store a large number of fixed-point attractor states, the distribution of synaptic weights turns out to be exactly the same as that of the perceptron, because each neuron's incoming weights must solve an independent perceptron-like storage problem; it therefore again contains a large fraction of `silent', or `potential', synapses. Finally, I will consider the joint distribution of the synaptic weights connecting pairs of neurons, and compare the theoretical results with recently published data on synaptic connectivity in cortical slices.
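
To make the link to the first part concrete, the short sketch below (my own illustration, with assumed network sizes, not code from the talk) shows how storing fixed-point patterns in a recurrent network decomposes into one independent perceptron problem per neuron: each neuron must produce its own target activity from the activities of the other neurons, which is why the single-synapse weight distribution matches the perceptron result.

import numpy as np

# Illustrative decomposition only (sizes are assumptions): each neuron of a
# recurrent network storing P fixed-point patterns faces its own perceptron
# problem, with the other neurons' activities as inputs and its own target
# activity as the label.
rng = np.random.default_rng(1)
N, P = 100, 30                               # neurons and stored fixed-point patterns
xi = rng.choice([-1.0, 1.0], size=(P, N))    # desired fixed-point activity patterns

for i in range(3):                           # a few example neurons
    inputs = np.delete(xi, i, axis=1)        # activities of the other N-1 neurons
    labels = xi[:, i]                        # neuron i's own target activity
    # Solving this perceptron problem independently for every i yields the full
    # recurrent weight matrix, row by row, with the diagonal excluded.
    print(f"neuron {i}: perceptron storage problem with "
          f"{inputs.shape[0]} patterns and {inputs.shape[1]} inputs")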