Analysis of synaptic scaling in combination with Hebbian plasticity in several simple networks.

Tetzlaff C, Kolodziejski C, Timme M, Wörgötter F - Front Comput Neurosci (2012)

Bottom Line: Here we analyze the development of synaptic connections and the resulting activity patterns in different feed-forward and recurrent neural networks with plasticity and scaling. These findings provide a better understanding of the dynamics of generic network structures where plasticity is combined with scaling. This also makes it possible to use this rule to construct an artificial network with certain desired storage properties.


Affiliation: III. Institute of Physics - Biophysics, Georg-August-Universität Göttingen, Göttingen, Germany.

ABSTRACT
Conventional synaptic plasticity in combination with synaptic scaling is a biologically plausible plasticity rule that guides the development of synapses toward stability. Here we analyze the development of synaptic connections and the resulting activity patterns in different feed-forward and recurrent neural networks with plasticity and scaling. We show under which constraints an external input given to a feed-forward network forms an input trace similar to a cell assembly (Hebb, 1949) by enhancing synaptic weights to larger stable values compared to the rest of the network. For instance, a weak input creates a weaker representation in the network than a strong input, which produces a trace along large parts of the network. These processes are strongly influenced by the underlying connectivity. For example, embedding recurrent structures (excitatory rings, etc.) into a feed-forward network extends the input trace into more distant layers, while inhibition shortens it. These findings provide a better understanding of the dynamics of generic network structures where plasticity is combined with scaling. They also make it possible to use this rule to construct an artificial network with certain desired storage properties.
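The abstract's central claim — that Hebbian growth combined with synaptic scaling drives a synapse to a stable weight whose value grows with input strength — can be illustrated with a minimal rate-based sketch. The rule below, dw/dt = mu*u*v + gamma*(v_T - v)*w^2, is one generic way to combine the two mechanisms; the linear neuron model (v = w*u) and all parameter values are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

def simulate_synapse(u, mu=0.01, gamma=0.005, v_T=0.5, w0=0.1,
                     steps=20000, dt=0.1):
    """Euler-integrate one synapse driven by a constant pre-synaptic rate u.

    Hebbian term mu*u*v grows the weight with correlated activity; the
    scaling term gamma*(v_T - v)*w**2 pushes the post-synaptic rate v
    toward the homeostatic target v_T. Returns the weight trajectory.
    """
    w = w0
    traj = []
    for _ in range(steps):
        v = w * u                                  # linear rate neuron
        dw = mu * u * v + gamma * (v_T - v) * w**2
        w += dt * dw
        traj.append(w)
    return np.array(traj)

weak = simulate_synapse(u=0.6)     # weak external input
strong = simulate_synapse(u=1.2)   # strong external input
print(weak[-1], strong[-1])        # both settle; the strong input higher
```

Both trajectories converge to a fixed point of the combined rule rather than growing without bound, and the stronger input yields the larger stable weight, matching the abstract's "larger stable values" statement.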

No MeSH data available.



License: open-access

Figure 3: A noisy external input leads to propagation of activity along several stages. The first neuron (red) receives a time-varying (noisy) input, which leads to a strong synapse onto the post-synaptic neuron. The activity is thus transmitted to the next stage (green), resulting in high (though smaller than the input) activity and weight. This transmission continues along several stages until the activity vanishes (gray). Even a short decrease of the input is compensated quite quickly.

Mentions: The simplified structure in Figure 2A depicts how an input signal to a random network travels along topological stages (layers) within this network, and how the network “reacts” to the input by adapting weights and activities. For this, we simulate the behavior of such a feed-forward network with one neuron per stage (cf. Figures 1A and 3), given a noisy input to layer one (activity of the red neuron). As shown before (Figure 1), this activity leads to a strong post-synaptic synapse, resulting in high (but lower than the input) activity at layer two (green neuron). This behavior propagates until the activity vanishes in layer four (gray). This is the basic property of the input trace (cell assembly) described above. Even if the input is reduced for a certain duration, the network re-adapts to its previous stable state very quickly (cf. Figure 3). Thus, even with a noisy external input, the network constructs a stable cell assembly along several stages.
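The chain simulation described here can be sketched in a few lines: a feed-forward chain of linear rate neurons, one per stage, with each feed-forward weight following a generic "Hebbian plus scaling" rule (dw/dt = mu*u*v + gamma*(v_T - v)*w^2, as above). The rule's form and all parameter values are assumptions chosen so that the input trace decays stage by stage toward the small homeostatic target v_T; in this simplified sketch the distant stages settle near that small baseline rather than at exactly zero.

```python
import numpy as np

rng = np.random.default_rng(0)
n_stages = 5
mu, gamma, v_T = 0.01, 0.1, 0.1       # illustrative parameters
dt, steps = 1.0, 40000

w = np.full(n_stages, 0.1)            # w[i]: synapse onto stage i
v = np.zeros(n_stages)                # v[i]: firing rate of stage i
v_avg = np.zeros(n_stages)            # rates averaged over the last 2000 steps

for t in range(steps):
    u_ext = 1.0 + 0.2 * rng.standard_normal()   # noisy external drive
    pre = np.concatenate(([u_ext], v[:-1]))     # pre-synaptic rate per stage
    v = w * pre                                  # linear feed-forward transmission
    w += dt * (mu * pre * v + gamma * (v_T - v) * w**2)
    if t >= steps - 2000:
        v_avg += v / 2000

print(np.round(v_avg, 3))   # activity falls off stage by stage along the chain
```

Despite the noisy drive, the weights settle at stable values and the activity profile along the chain is stationary: high at the first (red) stage, lower at the next (green), and close to the small homeostatic baseline in the distant (gray) stages.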

