Analysis of synaptic scaling in combination with Hebbian plasticity in several simple networks.

Tetzlaff C, Kolodziejski C, Timme M, Wörgötter F - Front Comput Neurosci (2012)

Bottom Line: Here we analyze the development of synaptic connections and the resulting activity patterns in different feed-forward and recurrent neural networks with plasticity and scaling. These findings provide a better understanding of the dynamics of generic network structures where plasticity is combined with scaling. This also makes it possible to use this rule to construct an artificial network with certain desired storage properties.

View Article: PubMed Central - PubMed

Affiliation: III. Institute of Physics - Biophysics, Georg-August-Universität Göttingen, Göttingen, Germany.

ABSTRACT
Conventional synaptic plasticity in combination with synaptic scaling is a biologically plausible plasticity rule that guides the development of synapses toward stability. Here we analyze the development of synaptic connections and the resulting activity patterns in different feed-forward and recurrent neural networks with plasticity and scaling. We show under which constraints an external input given to a feed-forward network forms an input trace similar to a cell assembly (Hebb, 1949) by enhancing synaptic weights to larger stable values as compared to the rest of the network. For instance, a weak input creates a weaker representation in the network than a strong input, which produces a trace along large parts of the network. These processes are strongly influenced by the underlying connectivity. For example, when embedding recurrent structures (excitatory rings, etc.) into a feed-forward network, the input trace is extended into more distant layers, while inhibition shortens it. These findings provide a better understanding of the dynamics of generic network structures where plasticity is combined with scaling. This also makes it possible to use this rule to construct an artificial network with certain desired storage properties.
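The stabilizing effect described in the abstract can be sketched numerically. The update rule below is an illustrative formalization only, not the paper's equations: it assumes plain Hebbian growth (μ·u·v) plus a quadratic scaling term that pulls the postsynaptic rate v toward a target rate v_T; the parameter names and values (mu, gamma, v_T) are assumptions made for the sketch.

```python
# Illustrative sketch of Hebbian plasticity combined with synaptic scaling
# for a single linear unit v = w * u. Parameter names and values are
# assumptions for illustration, not taken from the paper.

def simulate(u=1.0, v_T=0.1, mu=0.01, gamma=0.1, w0=0.05, dt=0.1, steps=20000):
    w = w0
    for _ in range(steps):
        v = w * u                                     # postsynaptic rate
        dw = mu * u * v + gamma * (v_T - v) * w ** 2  # Hebb + scaling
        w += dt * dw                                  # Euler integration
    return w

w_final = simulate()
```

With these illustrative parameters, the weight grows from its small initial value but then settles at a stable fixed point (≈0.37 here) instead of diverging, which is the qualitative behavior the abstract attributes to combining plasticity with scaling: pure Hebbian growth alone would be unstable.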

No MeSH data available.


Related in: MedlinePlus


Figure 4: The phase space of a feed-forward network and how the representation of a given input depends on the parameter vT. (A–C) The constraints vmin, vmax (vertical black dashed lines), and ωequ (purple dashed line) define the direction of change of weight and activity (arrows) from layer to layer on the fixed point curve (colored continuous line). (A1–A6) The sub-panels show the fixed point values of the weights (top) and activities (bottom) of the layers at different regimes (colored according to the phase space). (D,E) Resulting from these curves, the number of layers m needed to reach the global fixed point (thus, representing an input) varies for different vT. Dashed lines are the respective vmin lines. For more details see main text. Parameters: k = 2, N = 1; (A) vT = 10−3; (B) vT = 10−2; (C) vT = 10−1; (D,E) ℑ = 0.3.

Mentions: The two constraint equations (10) and (11), together with the nullcline for the weights (equation A1 in the Appendix), define the weight-activity phase space (cf. Figures 4A–C) as well as the development of the fixed points of activity and weight along layers, or rather stages m (cf. Figures 4D,E), in a feed-forward network. The trajectory depends on the terminal firing rate vT and the number of neurons N (cf. Figure 4).
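The layer-by-layer convergence to a global fixed point that Figures 4D,E describe can be illustrated with a self-contained sketch. It assumes the same kind of generic Hebbian-plus-scaling rule, dw/dt = μ·u·v + γ·(v_T − v)·w², with linear units v = w·u; the parameters and the rule itself are illustrative assumptions and do not reproduce the paper's equations (10), (11), or (A1). Each layer's weight is set to its stable fixed point given the previous layer's activity, and activity is propagated until it stops changing:

```python
import math

# Illustrative layer-to-layer fixed-point map for a feed-forward chain of
# linear units under an assumed Hebbian-plus-scaling rule
#   dw/dt = mu*u*v + gamma*(v_T - v)*w**2,   with v = w*u.
# Parameters are illustrative, not the paper's.

def w_star(u, mu=0.01, gamma=0.1, v_T=0.1):
    """Stable weight fixed point of a layer driven by presynaptic rate u."""
    # Steady state gives gamma*u*w**2 - gamma*v_T*w - mu*u**2 = 0;
    # take the positive root of this quadratic in w.
    disc = (gamma * v_T) ** 2 + 4.0 * gamma * mu * u ** 3
    return (gamma * v_T + math.sqrt(disc)) / (2.0 * gamma * u)

def layers_to_converge(v_input, tol=1e-6, max_layers=1000):
    """Propagate activity layer by layer until it reaches a global fixed point."""
    v = v_input
    for m in range(1, max_layers + 1):
        v_next = w_star(v) * v
        if abs(v_next - v) < tol:
            return m, v_next
        v = v_next
    return max_layers, v

# Input strength 0.3, echoing the caption's input value.
m, v_fix = layers_to_converge(0.3)
```

Under these assumptions the activity settles after only a handful of layers at the value where the per-layer weight fixed point equals 1, mirroring the qualitative picture of an input trace reaching a global fixed point after m stages; changing v_T shifts both the fixed point and the number of layers required.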


Analysis of synaptic scaling in combination with hebbian plasticity in several simple networks.

Tetzlaff C, Kolodziejski C, Timme M, Wörgötter F - Front Comput Neurosci (2012)

The phase space of a feed-forward network and how the representation of a given input depends on the parameter vT. (A–C) The constraints vmin, vmax (vertical black dashed lines), and ωequ (purple dashed line) define the direction of change of weight and activity (arrows) from layer to layer on the fixed point curve (colored continuous line). (A1–A6) The sub-panels show the fixed point values of the weights (top) and activities (bottom) of the layers at different regimes (colored according to the phase space). (D,E) Resulting from these curves the number of layers m needed to reach the global fixed point (thus, representing an input) varies for different vT. Dashed lines are the respective vmin lines. For more details see main text. Parameters: k = 2, N = 1; (A)vT = 10−3; (B)vT = 10−2; (C)vT = 10−1(D–E) ℑ = 0.3.
© Copyright Policy - open-access
Related In: Results  -  Collection

License
Show All Figures
getmorefigures.php?uid=PMC3376471&req=5

Figure 4: The phase space of a feed-forward network and how the representation of a given input depends on the parameter vT. (A–C) The constraints vmin, vmax (vertical black dashed lines), and ωequ (purple dashed line) define the direction of change of weight and activity (arrows) from layer to layer on the fixed point curve (colored continuous line). (A1–A6) The sub-panels show the fixed point values of the weights (top) and activities (bottom) of the layers at different regimes (colored according to the phase space). (D,E) Resulting from these curves the number of layers m needed to reach the global fixed point (thus, representing an input) varies for different vT. Dashed lines are the respective vmin lines. For more details see main text. Parameters: k = 2, N = 1; (A)vT = 10−3; (B)vT = 10−2; (C)vT = 10−1(D–E) ℑ = 0.3.
Mentions: The two constraints equations (10) and (11), and the cline for the weights (equation A1 in Appendix) define the weight-activity phase space (cf. Figures 4A–C) as well as the development of the fixed points of activity and weight along layers or rather stages m (cf. Figures 4D–E) in a feed-forward network. The trajectory depends on the terminal firing rate vT and the number of neurons N (c.f. Figure 4).

Bottom Line: Here we analyze the development of synaptic connections and the resulting activity patterns in different feed-forward and recurrent neural networks, with plasticity and scaling.These findings provide a better understanding of the dynamics of generic network structures where plasticity is combined with scaling.This makes it also possible to use this rule for constructing an artificial network with certain desired storage properties.

View Article: PubMed Central - PubMed

Affiliation: III. Institute of Physics - Biophysics, Georg-August-Universität Göttingen Göttingen, Germany.

ABSTRACT
Conventional synaptic plasticity in combination with synaptic scaling is a biologically plausible plasticity rule that guides the development of synapses toward stability. Here we analyze the development of synaptic connections and the resulting activity patterns in different feed-forward and recurrent neural networks, with plasticity and scaling. We show under which constraints an external input given to a feed-forward network forms an input trace similar to a cell assembly (Hebb, 1949) by enhancing synaptic weights to larger stable values as compared to the rest of the network. For instance, a weak input creates a less strong representation in the network than a strong input which produces a trace along large parts of the network. These processes are strongly influenced by the underlying connectivity. For example, when embedding recurrent structures (excitatory rings, etc.) into a feed-forward network, the input trace is extended into more distant layers, while inhibition shortens it. These findings provide a better understanding of the dynamics of generic network structures where plasticity is combined with scaling. This makes it also possible to use this rule for constructing an artificial network with certain desired storage properties.

No MeSH data available.


Related in: MedlinePlus