Analysis of synaptic scaling in combination with Hebbian plasticity in several simple networks.

Tetzlaff C, Kolodziejski C, Timme M, Wörgötter F - Front Comput Neurosci (2012)

Bottom Line: Here we analyze the development of synaptic connections and the resulting activity patterns in different feed-forward and recurrent neural networks with plasticity and scaling. These findings provide a better understanding of the dynamics of generic network structures in which plasticity is combined with scaling. This also makes it possible to use this rule to construct an artificial network with certain desired storage properties.


Affiliation: III. Institute of Physics - Biophysics, Georg-August-Universität Göttingen, Göttingen, Germany.

ABSTRACT
Conventional synaptic plasticity in combination with synaptic scaling is a biologically plausible plasticity rule that guides the development of synapses toward stability. Here we analyze the development of synaptic connections and the resulting activity patterns in different feed-forward and recurrent neural networks with plasticity and scaling. We show under which constraints an external input given to a feed-forward network forms an input trace similar to a cell assembly (Hebb, 1949) by enhancing synaptic weights to larger stable values as compared to the rest of the network. For instance, a weak input creates a weaker representation in the network than a strong input, which produces a trace along large parts of the network. These processes are strongly influenced by the underlying connectivity. For example, when embedding recurrent structures (excitatory rings, etc.) into a feed-forward network, the input trace is extended into more distant layers, while inhibition shortens it. These findings provide a better understanding of the dynamics of generic network structures where plasticity is combined with scaling. This also makes it possible to use this rule to construct an artificial network with certain desired storage properties.
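The combination of Hebbian growth and synaptic scaling described in the abstract can be sketched for a single linear rate neuron. The paper's actual Equation 1 is not reproduced on this page, so the sketch below uses one generic form of such a rule; the parameter names and values (mu, gamma, v_T) are illustrative assumptions, not the paper's values.

```python
# Minimal sketch of Hebbian plasticity combined with synaptic scaling for a
# single linear rate neuron (v = w * u). Assumed generic rule, NOT the
# paper's Equation 1:
#     dw/dt = mu * u * v  +  gamma * (v_T - v) * w**2
# The Hebbian term (mu * u * v) always grows the weight; the scaling term
# pushes activity v back toward a target rate v_T, so the weight settles at
# a stable value instead of diverging.

def stable_weight(u, mu=0.01, gamma=0.1, v_T=0.5, dt=0.1, steps=20000):
    """Euler-integrate the assumed plasticity rule for a constant input u."""
    w = 0.1                      # small initial weight
    for _ in range(steps):
        v = w * u                # linear rate neuron
        dw = mu * u * v + gamma * (v_T - v) * w ** 2
        w += dt * dw
    return w

# A stronger input produces a larger stable activity (v = w * u), echoing
# the abstract's point that strong inputs leave a stronger representation.
w_weak = stable_weight(1.0)
w_strong = stable_weight(2.0)
```

With these illustrative parameters the weight converges to a fixed point where the Hebbian and scaling terms cancel; for u = 1 that fixed point is the positive root of gamma*u*w**2 - gamma*v_T*w - mu*u**2 = 0, roughly w ≈ 0.65.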




Figure 2: The different networks investigated in this study. (A) A feed-forward network consists of M layers, each with Nk neurons. Each neuron j of layer m connects to all neurons of the source layer m − 1 and the target layer m + 1, and has no connections within its own layer. In this example, neuron n = a of the first layer k = 0 (pink) receives an input ℑ while the other neurons within this layer do not. (B) The smallest recurrent system: a neuron with a self-connection receiving external input. (C) A neuron receives an external input and is recurrently connected to another neuron. (D) Three neurons form a ring structure.
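The layered connectivity of panel (A) can be made concrete as an adjacency matrix. This sketch models the caption's "connects to" as directed feed-forward synapses from layer m to layer m + 1 (receiving from m − 1 is then the transpose relation); the function name and layer sizes are illustrative.

```python
def feedforward_adjacency(layer_sizes):
    """Adjacency matrix of a layered feed-forward network as in Figure 2A:
    a directed synapse from neuron i to neuron j exists exactly when j sits
    in the layer directly after i's layer; there are no connections within
    a layer."""
    # layer_of[i] = index of the layer that neuron i belongs to
    layer_of = [m for m, size in enumerate(layer_sizes) for _ in range(size)]
    n = len(layer_of)
    return [[1 if layer_of[j] == layer_of[i] + 1 else 0
             for j in range(n)]
            for i in range(n)]

# Three layers with 3, 4, and 3 neurons: 3*4 + 4*3 = 24 synapses in total,
# and every diagonal (intra-layer) block of the matrix is zero.
A = feedforward_adjacency([3, 4, 3])
```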


Mentions: Given a strong external input, synaptic plasticity combined with synaptic scaling (Equation 1) leads to an input trace similar to a cell assembly. (A) Schematic of the post-synaptic connectivity of selected neurons up to stage five. The red neuron receives a strong external input (yellow arrow). Parts of the descending network stages are highlighted to show the general connectivity structures analyzed in this study (cf. Figure 2). (B) Neural activities (dashed lines) and weights (solid lines) after stabilization, shown for the first four stages. Weights and activities of the stages linked to the external input (black) are significantly larger than those of control neurons (gray) over the first three stages, representing an input trace (cell assembly).

