We explored the progression of response dynamics in a feedforward neural network of spike-frequency adapting (SFA) neurons (Farkhooi et al., 2010), using the single-neuron model described in Müller et al. (2007). Our results suggest that progressive adaptation in sensory feedforward networks provides (i) a means to focus on dynamic changes in the sensory input, and (ii) a mechanism for shortening the integration time, fostering the projection of coincident converging inputs.
Our model observations are reminiscent of the experimentally observed stimulus response dynamics in the insect olfactory pathway, which essentially forms a three-layer excitatory feedforward network in which information progresses from the olfactory receptor neurons (ORNs) via the projection neurons (PNs) in the antennal lobe to the Kenyon cells (KCs) in the mushroom body. In response to a constant stimulus, the second-order PNs exhibit a phasic-tonic rate profile, while the third-order KCs show a very brief response composed of only a few spikes, a phenomenon described as temporal response sparseness.
A network implementation with SFA neurons can reproduce the experimental findings if SFA is strong in the KCs, an assumption that matches recent experimental results by Demmer and Kloppenburg (2009). New experimental results in the honeybee show that blocking inhibition does not alter the temporal sparseness of the KC population responses but increases their magnitude. We reproduced this result in our network model, where local inhibition tunes the PN odor code and thereby regulates the spatial response properties of the KC population.
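The sparsening effect of successive adaptation stages can be illustrated with a toy rate-based sketch. This is not the single-neuron model of Müller et al. (2007) or the network of Farkhooi et al. (2010); the `adapt_stage` function, its subtractive-adaptation dynamics, and all parameter values are illustrative assumptions. A stage with weak adaptation (PN-like) responds to a step stimulus with a phasic-tonic profile, while a downstream stage with strong adaptation (KC-like) responds only transiently:

```python
import numpy as np

def adapt_stage(inp, gain, tau, dt=1.0):
    """Toy rate model of spike-frequency adaptation (illustrative, not the
    Müller et al. 2007 model): the output rate is the input drive minus an
    adaptation variable that low-pass filters the stage's own output."""
    out = np.zeros_like(inp)
    a = 0.0                               # adaptation variable
    for t, x in enumerate(inp):
        r = max(x - a, 0.0)               # rectified, adapted output rate
        a += dt * (gain * r - a) / tau    # adaptation driven by own output
        out[t] = r
    return out

# Step stimulus of 450 time steps, onset at t = 50 (arbitrary units)
T = 500
stim = np.zeros(T)
stim[50:] = 1.0

pn = adapt_stage(stim, gain=0.5, tau=100.0)  # weak SFA: phasic-tonic profile
kc = adapt_stage(pn, gain=5.0, tau=50.0)     # strong SFA: brief transient

# PN-like stage keeps a substantial tonic rate; KC-like stage decays
# toward a small fraction of its onset peak (temporally sparse response).
```

Only the relative adaptation strength between the two stages matters for the qualitative effect; the KC-like stage responds mainly to the stimulus onset, consistent with a focus on dynamic input changes.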
Demmer H, Kloppenburg P (2009) Intrinsic membrane properties and inhibitory synaptic input of Kenyon cells as mechanisms for sparse coding? J Neurophysiol 102(3):1538-1550
Farkhooi F, Müller E, Nawrot MP (2010) Sequential sparsening by successive adaptation in neural populations. arXiv:1007.2345v1
Müller E, Buesing L, Schemmel J, Meier K (2007) Spike-frequency adapting neural ensembles: beyond mean adaptation and renewal theories. Neural Comput 19(11):2958-3010