Analysis of synaptic scaling in combination with Hebbian plasticity in several simple networks.

Tetzlaff C, Kolodziejski C, Timme M, Wörgötter F - Front Comput Neurosci (2012)

Bottom Line: Here we analyze the development of synaptic connections and the resulting activity patterns in different feed-forward and recurrent neural networks, with plasticity and scaling. These findings provide a better understanding of the dynamics of generic network structures where plasticity is combined with scaling. This also makes it possible to use this rule to construct an artificial network with certain desired storage properties.


Affiliation: III. Institute of Physics - Biophysics, Georg-August-Universität Göttingen, Göttingen, Germany.

ABSTRACT
Conventional synaptic plasticity in combination with synaptic scaling is a biologically plausible plasticity rule that guides the development of synapses toward stability. Here we analyze the development of synaptic connections and the resulting activity patterns in different feed-forward and recurrent neural networks, with plasticity and scaling. We show under which constraints an external input given to a feed-forward network forms an input trace similar to a cell assembly (Hebb, 1949) by enhancing synaptic weights to larger stable values as compared to the rest of the network. For instance, a weak input creates a weaker representation in the network than a strong input, which produces a trace along large parts of the network. These processes are strongly influenced by the underlying connectivity. For example, when embedding recurrent structures (excitatory rings, etc.) into a feed-forward network, the input trace is extended into more distant layers, while inhibition shortens it. These findings provide a better understanding of the dynamics of generic network structures where plasticity is combined with scaling. This also makes it possible to use this rule to construct an artificial network with certain desired storage properties.
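
To make the combination of Hebbian plasticity and synaptic scaling concrete, the following minimal sketch (Python) integrates a single synaptic weight under a Hebbian term plus an activity-dependent scaling term. The specific update dω/dt = μ·u·v + γ·(vT − v)·ω² and the linear rate neuron v = ω·u + ℑ are illustrative assumptions modeled on the SPaSS-type rule referenced in the text (Tetzlaff et al., 2011), not the paper's exact equations; all parameter values are arbitrary.

import numpy as np

# Minimal sketch (not the authors' code): Euler integration of a single
# synaptic weight under Hebbian plasticity plus synaptic scaling,
#     dω/dt = μ·u·v + γ·(vT − v)·ω²,
# for an assumed linear rate neuron v = ω·u + ℑ.  The Hebbian term grows
# the weight, while the scaling term pulls the output rate v back toward vT.
def simulate_weight(u=0.2, ext_input=0.05, mu=0.01, gamma=0.001,
                    v_T=0.1, w0=0.5, dt=0.1, steps=200_000):
    w = w0
    for _ in range(steps):
        v = w * u + ext_input                        # postsynaptic rate (assumed linear neuron)
        dw = mu * u * v + gamma * (v_T - v) * w**2   # plasticity + scaling
        w += dt * dw
        if not np.isfinite(w) or abs(w) > 1e6:       # detect divergent weight dynamics
            return np.inf
    return w

print(simulate_weight())   # settles at a finite, stable weight

With these (assumed) parameters the weight converges to a finite value because, once v exceeds vT, the negative scaling term balances the Hebbian growth; setting gamma to zero removes this balance and the weight grows without bound.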



Figure 7: Different recurrent structures are stabilized by the SPaSS rule. For different parameter values κ and vT, the maximal input ℑmax leading to stable weights is calculated and shown in color code (note the different scale bars). (A) Self-connected neurons, (B) bi-directional structures, and (C) three-neuron rings are analyzed. If one structure is stabilized for a given parameter set (κ, vT), all other structures are stable, too. Only the maximal input decreases for more direct recurrences (fewer neurons).

Mentions: These solutions have to be real (imaginary part equal to zero), and the derivative of equation (19) at these solutions has to be smaller than zero in order to stabilize the dynamics; therefore, the parameters ℑ, κ, and vT have to be restricted. The calculations for assessing these parameters are complex and closed-form solutions can no longer be derived. Therefore, the stability for a given set of parameters has to be assessed by numerical calculation of the roots. To reduce the number of parameters, only the ratio of the rates κ = μ/γ (cf. equations 8–22) is considered, and we plot the maximally allowed input ℑmax that still leads to a stable weight. All smaller inputs (0 < ℑ < ℑmax) lead to a stable weight ω, too, and inputs above this value (ℑ > ℑmax) lead to divergent weight dynamics. Thus, we show the parameters that lead to stable dynamics in a κ-vT plot with ℑmax color coded (Figure 7A). The self-connected neuron stabilizes best if γ is as large as μ (κ ≈ 1), but for the more realistic ratio of one order of magnitude difference (κ ≈ 10), stabilization is still possible over a wide regime. Furthermore, stability requires that vT is small or even negative. Here we stress that (1) negative vT does not imply negative neuronal firing rates, as these systems balance plasticity and scaling (see equation 3 and Tetzlaff et al., 2011), and (2) negative vT values can be avoided by simply adding inhibition to any of these recurrent networks (Figures 2B–D). As shown above, constant weight inhibition (Figures 6D–I) leads to a shift of the network properties toward larger allowed positive values. The general disk-like shape of the stability plot, however, does not change when inhibition is added, and we will thus continue to consider only excitatory recurrent networks (although vT might be negative).
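
The numerical procedure described above (find the real roots of the weight dynamics, keep those with negative slope, and record the largest input that still admits such a stable root) can be sketched as follows. The code does not use the paper's equation (19); it assumes, purely for illustration, a linear self-connected rate neuron v = ℑ/(1 − ω) for ω < 1 together with a SPaSS-type update dω/dt = μ·v² + γ·(vT − v)·ω², rescaled by γ so that only the ratio κ = μ/γ appears.

import numpy as np
from scipy.optimize import brentq

# Sketch of the stability scan (assumed toy model, not equation 19 of the paper):
# for a given input I and parameters (κ, vT), look for a fixed point of the
# weight dynamics with negative slope; then report the largest scanned input
# that still admits such a stable fixed point (cf. ℑmax in Figure 7A).
def dw_dt(w, I, kappa, v_T):
    v = I / (1.0 - w)                        # assumed linear self-connected neuron
    return kappa * v**2 + (v_T - v) * w**2   # weight dynamics divided by γ

def has_stable_root(I, kappa, v_T, w_grid=np.linspace(1e-3, 0.999, 2000)):
    """True if dω/dt = 0 has a real root with negative derivative."""
    f = dw_dt(w_grid, I, kappa, v_T)
    for i in np.where(f[:-1] * f[1:] < 0)[0]:             # bracketed sign changes
        w_star = brentq(dw_dt, w_grid[i], w_grid[i + 1], args=(I, kappa, v_T))
        eps = 1e-6                                        # numerical slope at the root
        slope = (dw_dt(w_star + eps, I, kappa, v_T)
                 - dw_dt(w_star - eps, I, kappa, v_T)) / (2 * eps)
        if slope < 0:
            return True
    return False

def max_stable_input(kappa, v_T, inputs=np.linspace(1e-3, 1.0, 200)):
    """Largest scanned input that still yields a stable weight (0 if none)."""
    stable = [I for I in inputs if has_stable_root(I, kappa, v_T)]
    return max(stable) if stable else 0.0

# Coarse scan of the κ-vT plane, analogous in spirit to Figure 7A.
for kappa in (1.0, 10.0):
    for v_T in (-0.1, 0.0, 0.1):
        print(f"kappa={kappa:5.1f}  vT={v_T:5.2f}  I_max~{max_stable_input(kappa, v_T):.3f}")

Even in this assumed toy model, larger κ and larger positive vT reduce the maximal stable input, qualitatively in line with the trends described in the text for Figure 7.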

