Analysis of synaptic scaling in combination with Hebbian plasticity in several simple networks.

Tetzlaff C, Kolodziejski C, Timme M, Wörgötter F - Front Comput Neurosci (2012)

Bottom Line: Here we analyze the development of synaptic connections and the resulting activity patterns in different feed-forward and recurrent neural networks, with plasticity and scaling. These findings provide a better understanding of the dynamics of generic network structures where plasticity is combined with scaling. This also makes it possible to use this rule to construct an artificial network with certain desired storage properties.


Affiliation: III. Institute of Physics - Biophysics, Georg-August-Universität Göttingen, Göttingen, Germany.

ABSTRACT
Conventional synaptic plasticity in combination with synaptic scaling is a biologically plausible plasticity rule that guides the development of synapses toward stability. Here we analyze the development of synaptic connections and the resulting activity patterns in different feed-forward and recurrent neural networks, with plasticity and scaling. We show under which constraints an external input given to a feed-forward network forms an input trace similar to a cell assembly (Hebb, 1949) by enhancing synaptic weights to larger stable values as compared to the rest of the network. For instance, a weak input creates a weaker representation in the network than a strong input, which produces a trace along large parts of the network. These processes are strongly influenced by the underlying connectivity. For example, when embedding recurrent structures (excitatory rings, etc.) into a feed-forward network, the input trace is extended into more distant layers, while inhibition shortens it. These findings provide a better understanding of the dynamics of generic network structures where plasticity is combined with scaling. This also makes it possible to use this rule to construct an artificial network with certain desired storage properties.
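The stabilizing effect described in the abstract can be illustrated with a minimal numerical sketch. The specific form below, a Hebbian term plus a quadratic scaling term, dw/dt = μ·u·v + γ·(v_T − v)·w^κ with κ = 2 and a linear rate neuron v = w·u, is an illustrative assumption consistent with the parameters named in Figure 6, not necessarily the paper's exact formulation; all parameter values are chosen for demonstration.

```python
def stable_weight(u, mu=1e-3, gamma=1e-3, v_T=1e-2, kappa=2.0,
                  w0=0.5, dt=0.1, steps=100000):
    """Euler-integrate one synapse under Hebbian plasticity plus scaling.

    Assumed dynamics (linear rate neuron, v = w*u):
        dw/dt = mu*u*v + gamma*(v_T - v)*w**kappa
    The Hebbian term alone would let w grow without bound; the scaling
    term weakens the synapse whenever the output rate v exceeds the
    target v_T, so w settles at a stable fixed point.
    """
    w = w0
    for _ in range(steps):
        v = w * u                                     # postsynaptic rate
        w += dt * (mu * u * v + gamma * (v_T - v) * w**kappa)
    return w
```

With u = 1 and μ = γ, setting dw/dt = 0 gives the quadratic w² − v_T·w − 1 = 0, i.e., w* ≈ 1.005; the integration converges there from a wide range of initial weights, which is the kind of stability the abstract refers to.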


Figure 6: Inhibition and recurrent connections influence the fixed points of activity and weights from layer to layer. Panels (A,D,G) show the phase space of the modified feed-forward networks (blue lines show the control from Figure 4A). Different structures result in different activity (B,E,H) and weight (C,F,I) development over stages m. (A–C) Feed-forward network with constant excitatory recurrent connections of weight ωR within each layer. These recurrences shift the constraints vmin and vmax of the fixed points and narrow the regime of possible inputs; however, the length of a trace and its difference are larger for stronger recurrences (RωR). (D–F) Fixed points with feed-forward inhibition. (G–I) As (A–C), but with inhibitory recurrent connections. Inhibition (feed-forward and feedback) enlarges the regime of inputs that yields stable development over layers; however, the length and difference of the traces are shortened. Parameters: κ = 2, vT = 10⁻², N = 1, and RωR/I = 0, 0.1, …, 0.5. For (B,C,E,F,H,I), ℑ = 0.2.

Mentions: In this section we consider, step by step, our feed-forward network with (i) lateral excitatory connections within layers (top row in Figure 6), (ii) feed-forward inhibition (middle row in Figure 6), and (iii) feedback (self-)inhibition (bottom row in Figure 6), while all intra-layer connections stay constant. Under these constraints, analytical results can still be obtained, and we state only the main equations here; all calculations, similar to those shown above, are provided in the Appendix. In principle we could also treat mixtures of the structures presented in Figure 6 analytically, but the calculations then become lengthy and opaque, and the same holds for plastic intra-layer connections.
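For case (i), constant excitatory recurrence within each layer, the effect on the layer-to-layer fixed points can be sketched in closed form. The sketch below assumes the plasticity-plus-scaling form dw/dt = μ·u·v + γ·(v_T − v)·w² with μ = γ, and that recurrence of total strength RωR < 1 amplifies a layer's rate as v = w·u/(1 − RωR); these choices are illustrative assumptions matching the figure's parameters (κ = 2, vT = 10⁻², input ℑ = 0.2), not the paper's exact equations.

```python
import math

def layer_rates(v0, n_layers=10, R_omega=0.0, v_T=1e-2):
    """Stable layer rates in a feed-forward chain with intra-layer
    excitatory recurrence of total strength R_omega (< 1).

    Assumed weight dynamics per inter-layer synapse (mu = gamma):
        dw/dt = u*v + (v_T - v)*w**2,   with  v = w*u / (1 - R_omega).
    Setting dw/dt = 0 and dividing out w gives the quadratic
        w**2 - (v_T*(1 - R_omega)/u)*w - u = 0,
    whose positive root is the stable weight; the resulting rate
    v = w*u/(1 - R_omega) becomes the input u to the next layer.
    """
    rates = []
    u = v0
    for _ in range(n_layers):
        b = v_T * (1.0 - R_omega) / u
        w = 0.5 * (b + math.sqrt(b * b + 4.0 * u))   # positive root
        u = w * u / (1.0 - R_omega)                  # next layer's input
        rates.append(u)
    return rates

# An input of 0.2 keeps the rate clearly elevated over more layers
# when recurrence is present, before decaying to a baseline near v_T:
plain = layer_rates(0.2)                  # no recurrence
recur = layer_rates(0.2, R_omega=0.3)     # with excitatory recurrence
```

In this sketch the recurrent case holds the rate above twice the baseline for an additional layer, mirroring the caption's observation that stronger recurrence lengthens the input trace, while inhibition (which would shrink rather than amplify each layer's rate) shortens it.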


