The interplay of plasticity and adaptation in neural circuits: a generative model.

Bernacchia A - Front Synaptic Neurosci (2014)

Bottom Line: The neural circuit is able to generate spontaneous patterns of activity that reproduce exactly the probability distribution of experienced stimuli. In particular, the landscape of the phase space includes a large number of stable states (attractors) that sample precisely this prior distribution. This work demonstrates that the interplay between distinct dynamical processes gives rise to useful computation, and proposes a framework in which neural circuit models for Bayesian inference may be developed in the future.


Affiliation: School of Engineering and Science, Jacobs University Bremen, Bremen, Germany.

ABSTRACT
Multiple neural and synaptic phenomena take place in the brain. They operate over a broad range of timescales, and the consequences of their interplay are still unclear. In this work, I study a computational model of a recurrent neural network in which two dynamic processes take place: sensory adaptation and synaptic plasticity. Both phenomena are ubiquitous in the brain, but their dynamic interplay has not been investigated. I show that when both processes are included, the neural circuit is able to perform a specific computation: it becomes a generative model for certain distributions of input stimuli. The neural circuit is able to generate spontaneous patterns of activity that reproduce exactly the probability distribution of experienced stimuli. In particular, the landscape of the phase space includes a large number of stable states (attractors) that sample precisely this prior distribution. This work demonstrates that the interplay between distinct dynamical processes gives rise to useful computation, and proposes a framework in which neural circuit models for Bayesian inference may be developed in the future.
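A loose, toy-level illustration of the general idea described in the abstract (this is not the model developed in the article, which combines sensory adaptation with synaptic plasticity acting on continuous tuning offsets): the sketch below uses a standard Hopfield-style recurrent network in which Hebbian plasticity stores experienced stimuli as attractors, and runs started from random states settle into those stored patterns. All parameters, names, and the learning rule are illustrative assumptions, and the sketch does not reproduce the article's quantitative result that the attractors sample the stimulus distribution.

    # Toy sketch (assumed, not the article's model): Hebbian plasticity stores
    # experienced stimulus patterns as attractors of a recurrent network, and
    # "spontaneous" runs from random initial states settle into them.
    import numpy as np

    rng = np.random.default_rng(0)
    N = 100                                       # number of binary (+1/-1) neurons
    P = 3                                         # number of distinct experienced stimuli
    stimuli = rng.choice([-1, 1], size=(P, N))    # random stimulus patterns

    # Hebbian plasticity: each experienced stimulus adds an outer-product term.
    W = np.zeros((N, N))
    for s in stimuli:
        W += np.outer(s, s) / N
    np.fill_diagonal(W, 0.0)

    def settle(state, sweeps=20):
        """Run a fixed number of asynchronous sign-threshold update sweeps."""
        for _ in range(sweeps):
            for i in rng.permutation(N):
                state[i] = 1 if W[i] @ state >= 0 else -1
        return state

    # Spontaneous activity: start from random states and record which stored
    # stimulus each run converges to (measured by overlap with the patterns).
    counts = np.zeros(P, dtype=int)
    for _ in range(200):
        final = settle(rng.choice([-1, 1], size=N).astype(float))
        overlaps = stimuli @ final / N
        counts[np.argmax(np.abs(overlaps))] += 1

    print("visits per stored stimulus:", counts)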




Figure 2: The adaptation mechanism, by which the tuning curves of neurons are modified according to the presented stimulus. Tuning curves of two neurons are shown, one in blue and the other in red, before and after adaptation (solid and dashed lines, respectively). The presented stimulus is indicated by the black dot and the vertical black line. The tuning offsets of the two neurons are shown by the blue and red dots. The tuning offsets are attracted toward the stimulus, as shown by the arrows. Left: sigmoidal tuning curves. Right: periodic tuning curves.

Mentions: The tuning curve of a neuron is modified during the presentation of stimuli, as a consequence of adaptation. We implement a phenomenon known as adaptation to the mean, which is ubiquitously observed across a wide range of species, sensory modalities, and stimulus variables (Wark et al., 2007; Rieke and Rudd, 2009). In particular, the presentation of a given stimulus drives a change in the tuning offsets of neurons such that they tend to converge toward that stimulus. An illustration of these dynamics is shown in Figure 2. The tuning offset is a function of time, μi(t), and changes according to
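The equation that follows in the original article is not reproduced in this excerpt. As a minimal sketch of an adaptation-to-the-mean rule consistent with the description above, assuming a simple relaxation of each offset toward the currently presented stimulus s(t) with an adaptation time constant \tau_A (the functional form, the symbol s(t), and \tau_A are assumptions, not the article's equation):

    \tau_A \, \frac{d\mu_i(t)}{dt} = s(t) - \mu_i(t)

Under such a rule each tuning offset \mu_i drifts toward the presented stimulus, which is the attraction indicated by the arrows in Figure 2.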

