We propose a computational model that combines classical learning rules with structural changes in spiking neural network architectures, inspired by observations of the morphological changes that biological synapses undergo during their life cycle. Our model is based on the assumption that newly formed synapses are initially silent due to their lack of AMPA receptors. In these synapses, only co-activation with other synapses can lead to postsynaptic potentials, and if this co-activation is absent for a critical period, the synapse degenerates again. To study the interaction of structural plasticity with classical STDP learning rules, we simulated a highly recurrently connected spiking neural network and presented topological inputs to its neurons. We implemented the triplet STDP learning rule proposed by Pfister and Gerstner, and applied a structural plasticity rule in which a critical period is opened whenever a synaptic weight decreases below a given threshold. If the weight does not reach a consolidation threshold by the end of the critical period, the synapse is pruned and a new synapse is instantiated elsewhere in the network; otherwise, the synapse is maintained. This approach enforces homeostasis in the number of consolidated synapses while keeping the connectivity at a desired level of sparseness. Figure 1 shows simulation results in which the input topology of the network is first learned using only STDP; then, after activating structural plasticity, the structure of the connectivity matrix itself is adapted such that it reflects the input topology.
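The critical-period rule described above can be sketched as follows. This is a minimal illustrative sketch only: the class and function names, the threshold values, the critical-period length, and the replacement policy (one new weak synapse at a random site per pruned synapse) are our assumptions, not the authors' implementation.

```python
import random

# Illustrative parameters (assumptions, not values from the abstract)
SILENCE_THRESHOLD = 0.1        # weight below this opens a critical period
CONSOLIDATION_THRESHOLD = 0.5  # weight must reach this to be consolidated
CRITICAL_PERIOD = 100          # structural-update steps before pruning

class Synapse:
    """A synapse with a weight and an optional critical-period countdown."""
    def __init__(self, pre, post, weight):
        self.pre, self.post, self.weight = pre, post, weight
        self.critical_timer = None  # None means the synapse is consolidated

def structural_update(synapses, n_neurons):
    """One structural-plasticity step: open/close critical periods,
    prune synapses that failed to consolidate, and re-instantiate
    one new synapse per pruned one to keep the total count fixed."""
    survivors = []
    n_pruned = 0
    for s in synapses:
        # A weak consolidated synapse enters its critical period.
        if s.critical_timer is None and s.weight < SILENCE_THRESHOLD:
            s.critical_timer = CRITICAL_PERIOD
        if s.critical_timer is not None:
            if s.weight >= CONSOLIDATION_THRESHOLD:
                s.critical_timer = None      # re-consolidated in time
            else:
                s.critical_timer -= 1
                if s.critical_timer <= 0:    # failed to consolidate
                    n_pruned += 1
                    continue                 # prune this synapse
        survivors.append(s)
    # Homeostasis: replace each pruned synapse with a new, initially
    # weak (silent) one at a random site in the network.
    for _ in range(n_pruned):
        survivors.append(Synapse(random.randrange(n_neurons),
                                 random.randrange(n_neurons),
                                 weight=0.05))
    return survivors
```

In a full simulation, the STDP rule would modify `weight` between structural updates; here the weights are static, which suffices to show that the pruning-and-replacement loop keeps the synapse count constant.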