A formalism for evaluating analytically the cross-correlation structure of a firing-rate network model.

Fasoli D, Faugeras O, Panzeri S - J Math Neurosci (2015)

Bottom Line: In particular, we show that a strong input can make the neurons almost independent, suggesting that functional connectivity does not depend only on the static anatomical connectivity, but also on the external inputs. To conclude, we show a very counterintuitive phenomenon, which we call stochastic synchronization, through which neurons become almost perfectly correlated even if the sources of randomness are independent. Due to its ability to quantify how the activity of individual neurons and the correlations among them depend upon external inputs, the formalism introduced here can serve as a basis for exploring analytically the computational capabilities of population codes expressed by recurrent neural networks.

View Article: PubMed Central - PubMed

Affiliation: NeuroMathComp Laboratory, INRIA Sophia Antipolis Méditerranée, 2004 Route des Lucioles, BP 93, 06902 Valbonne, France ; Neural Computation Laboratory, Center for Neuroscience and Cognitive Systems @Unitn, Istituto Italiano di Tecnologia, 38068 Rovereto, Italy.

ABSTRACT
We introduce a new formalism for evaluating analytically the cross-correlation structure of a finite-size firing-rate network with recurrent connections. The analysis performs a first-order perturbative expansion of the neural activity equations, which include three different sources of randomness: the background noise of the membrane potentials, their initial conditions, and the distribution of the recurrent synaptic weights. This allows the analytical quantification of the relationship between anatomical and functional connectivity, i.e., of how the synaptic connections determine the statistical dependencies, at any order, among different neurons. The technique we develop is general, but for simplicity and clarity we demonstrate its efficacy by applying it to the case of synaptic connections described by regular graphs. The analytical equations so obtained reveal previously unknown behaviors of recurrent firing-rate networks, especially how correlations are modified by the external input, by the finite size of the network, by the density of the anatomical connections, and by correlations in the sources of randomness. In particular, we show that a strong input can make the neurons almost independent, suggesting that functional connectivity does not depend only on the static anatomical connectivity, but also on the external inputs. Moreover, we prove that in general it is not possible to find a mean-field description à la Sznitman of the network if the anatomical connections are too sparse or if our three sources of variability are correlated. To conclude, we show a very counterintuitive phenomenon, which we call stochastic synchronization, through which neurons become almost perfectly correlated even if the sources of randomness are independent. Due to its ability to quantify how the activity of individual neurons and the correlations among them depend upon external inputs, the formalism introduced here can serve as a basis for exploring analytically the computational capabilities of population codes expressed by recurrent neural networks.
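To make the setting concrete, the following minimal sketch simulates a generic stochastic firing-rate network and estimates the pairwise correlations of the membrane potentials empirically across independent trials. The dynamics dV = (-V/tau + J S(V) + I) dt + sigma dW, the tanh activation, the fully connected regular weight matrix, and all parameter values are illustrative assumptions, not the paper's exact Eq. (2.1); the sketch only reproduces qualitatively the claim above that a strong external input decorrelates the neurons (by pushing them into the saturated regime of the activation function, where the effective recurrent coupling vanishes).

```python
# Hypothetical sketch: stochastic firing-rate network
#   dV = (-V/tau + J @ S(V) + I) dt + sigma dW,
# integrated with the Euler-Maruyama scheme; cross-correlations are
# estimated across independent trials. All parameters are illustrative.
import numpy as np

def simulate(N=10, I=1.0, T=2.0, dt=1e-3, trials=2000, seed=0):
    rng = np.random.default_rng(seed)
    tau, sigma = 0.1, 0.5
    # Regular (fully connected, no self-loops) weight matrix; the
    # 1/(N-1) normalization is an assumption.
    J = (np.ones((N, N)) - np.eye(N)) / (N - 1)
    V = np.zeros((trials, N))        # deterministic initial condition
    for _ in range(int(T / dt)):
        drift = -V / tau + np.tanh(V) @ J.T + I
        V += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal(V.shape)
    return V                         # membrane potentials at time T

for inp in (0.0, 20.0):              # weak vs. strong external input
    V = simulate(I=inp)
    C = np.corrcoef(V.T)             # N x N correlations across trials
    off = C[~np.eye(C.shape[0], dtype=bool)]
    print(f"I = {inp:5.1f}: mean cross-correlation = {off.mean():.3f}")
```

Under these assumptions, the strong-input run yields near-zero off-diagonal correlations while the weak-input run does not, mirroring the effect the abstract describes.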

No MeSH data available.


Related in: MedlinePlus

Fig. 7: Single-neuron marginal probability density of the membrane potential (left) and of the firing rate (right), in a network with topology  (top) and  (bottom). The parameters used for the simulation are , , and those of Table 1 and Eq. (7.1). The numerical probability density has been calculated by simulating the equations in (2.1)  times with the Euler–Maruyama scheme and then applying a Monte Carlo method, while the analytical density has been obtained by integrating Eqs. (4.11) and (4.12) over all but one dimension. The comparison shows that the mean and the variance of the numerical simulations are in good agreement with the corresponding analytical quantities, even if the numerical probability density is not perfectly normal, due to the relatively small higher-order corrections that are neglected in our first-order perturbative approach.
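As an illustration of the numerical side of this comparison, the sketch below builds the Monte Carlo marginal density from Euler–Maruyama trials and compares it with a Gaussian of matched mean and variance. It is hypothetical: it reuses the simulate helper from the previous sketch, and substitutes sample moments for the analytical moments of Eqs. (4.11) and (4.12), which are not reproduced here.

```python
# Hypothetical sketch of the Fig. 7 comparison: histogram (Monte Carlo)
# marginal density of one neuron's potential vs. a Gaussian marginal,
# which is what a first-order perturbative expansion predicts.
import numpy as np

V = simulate(I=1.0)                  # (trials, N) potentials at time T
v = V[:, 0]                          # marginal of neuron 0
mu, var = v.mean(), v.var()          # stand-ins for the analytical moments

hist, edges = np.histogram(v, bins=50, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
gauss = np.exp(-(centers - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

# Mean and variance agree by construction; the residual shape mismatch
# reflects the neglected higher-order corrections mentioned in the caption.
print("max |empirical - Gaussian| density gap:", np.abs(hist - gauss).max())
```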

Mentions: Parameters used for the numerical simulations of Figs. 5, 6, 7 and the right-hand side of Fig. 8. For the left-hand side of Fig. 8 and for Fig. 9 the parameters are the same, with the only exception of , , and , which have been set to zero.

