Persistent activity in neural networks with dynamic synapses.

Barak O, Tsodyks M - PLoS Comput. Biol. (2007)

Bottom Line: One of the possible mechanisms that can underlie persistent activity is recurrent excitation mediated by intracortical synaptic connections. Here we analyze the effect of synaptic dynamics on the emergence and persistence of attractor states in interconnected neural networks. This analysis raises the possibility that the framework of attractor neural networks can be extended to represent time-dependent stimuli.

View Article: PubMed Central - PubMed

Affiliation: Department of Neurobiology, The Weizmann Institute of Science, Rehovot, Israel.

ABSTRACT
Persistent activity states (attractors), observed in several neocortical areas after the removal of a sensory stimulus, are believed to be the neuronal basis of working memory. One of the possible mechanisms that can underlie persistent activity is recurrent excitation mediated by intracortical synaptic connections. A recent experimental study revealed that connections between pyramidal cells in prefrontal cortex exhibit various degrees of synaptic depression and facilitation. Here we analyze the effect of synaptic dynamics on the emergence and persistence of attractor states in interconnected neural networks. We show that different combinations of synaptic depression and facilitation result in qualitatively different network dynamics with respect to the emergence of the attractor states. This analysis raises the possibility that the framework of attractor neural networks can be extended to represent time-dependent stimuli.
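The depression and facilitation discussed in the abstract are commonly described by the Tsodyks-Markram phenomenological model of dynamic synapses. As an illustration only (the paper's exact equations and parameter values are not reproduced here), the following sketch computes the per-spike synaptic efficacy u·x, where the utilization u facilitates toward 1 with each spike and relaxes back to its baseline U with time constant tau_f, while the available resources x are depleted by each spike and recover with time constant tau_d. All parameter values below are illustrative assumptions.

```python
import numpy as np

def tm_synapse(spike_times, U=0.2, tau_f=1.0, tau_d=0.2):
    """Per-spike efficacy u*x of a Tsodyks-Markram dynamic synapse (sketch).

    u: utilization (facilitation variable), relaxes to baseline U with tau_f.
    x: available resources (depression variable), recovers to 1 with tau_d.
    Returns the list of transmitted efficacies, one per presynaptic spike.
    """
    u, x = U, 1.0
    last_t = None
    efficacies = []
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            u = U + (u - U) * np.exp(-dt / tau_f)      # facilitation decays
            x = 1.0 + (x - 1.0) * np.exp(-dt / tau_d)  # resources recover
        u = u + U * (1.0 - u)   # spike increments utilization
        efficacies.append(u * x)
        x = x * (1.0 - u)       # spike depletes resources
        last_t = t
    return efficacies

# A regular spike train probes the two regimes the abstract contrasts:
train = [0.1 * i for i in range(5)]
facilitating = tm_synapse(train, U=0.1, tau_f=1.0, tau_d=0.05)  # efficacy grows
depressing = tm_synapse(train, U=0.6, tau_f=0.05, tau_d=1.0)    # efficacy shrinks
```

Whether a connection depresses or facilitates depends on the balance of U, tau_f, and tau_d: a small baseline utilization with slow facilitation decay yields growing responses, while a large utilization with slow resource recovery yields shrinking ones.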



pcbi-0030035-g001: Network Structure. The network is divided into several populations, each responding primarily to a certain stimulus. Each population is further partitioned into subpopulations, differing in their synaptic properties. Connections are strongest within subpopulations, weaker between subpopulations, and weakest across populations.

Mentions: In this contribution, we consider an attractor neural network with connections that have already been formed by learning several stimuli [14,15]. We assume that the network comprises a set of neuronal populations, each responding primarily to a certain stimulus. This scheme, via Hebbian learning, can strengthen the synaptic connections within a population and form a stable activity state. Drawing on recent experimental results [12], we assume that the neurons within each population differ in the dynamic properties of their synapses and thus exhibit different temporal response profiles to the same stimuli. This firing can then lead to a further differentiation of synaptic strengths within the population, whereby neurons with similar synaptic dynamics are connected more strongly to one another than to ones with dissimilar synaptic dynamics. We thus consider a network comprising several attractor populations, each divided into subpopulations with different synaptic dynamics (Figure 1). These populations interact via both excitatory dynamic synapses and inhibition to generate rich dynamics in response to external stimuli. Since the synaptic dynamics differ between subpopulations, we expect them to respond differently to different temporal profiles of the input, which could result in a greater computational power for the network.
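The hierarchy of connection strengths described above (strongest within a subpopulation, weaker between subpopulations of the same population, weakest across populations) can be sketched as a block-structured weight matrix. The function below is a minimal illustration; the names and the three weight values are assumptions, not the paper's parameters, and each (population, subpopulation) pair is collapsed to a single unit.

```python
import numpy as np

def block_connectivity(n_pop=3, n_sub=2, w_within=1.0, w_between_sub=0.5, w_across_pop=0.1):
    """Connectivity among n_pop populations, each with n_sub subpopulations.

    One unit per subpopulation; returns an (n_pop*n_sub, n_pop*n_sub) matrix
    with w_within on the diagonal (within a subpopulation), w_between_sub
    inside each population block, and w_across_pop everywhere else.
    """
    n = n_pop * n_sub
    W = np.full((n, n), w_across_pop)          # weakest: across populations
    for p in range(n_pop):
        blk = slice(p * n_sub, (p + 1) * n_sub)
        W[blk, blk] = w_between_sub            # weaker: between subpopulations
    np.fill_diagonal(W, w_within)              # strongest: within a subpopulation
    return W

W = block_connectivity()
```

Ordering the units by population makes the structure of Figure 1 visible directly in the matrix: dense diagonal blocks for each population, a strong diagonal for each subpopulation, and a weak background coupling everywhere else.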

