A formalism for evaluating analytically the cross-correlation structure of a firing-rate network model.

Fasoli D, Faugeras O, Panzeri S - J Math Neurosci (2015)

Bottom Line: In particular, we show that a strong input can make the neurons almost independent, suggesting that functional connectivity does not depend only on the static anatomical connectivity, but also on the external inputs. To conclude, we show a very counterintuitive phenomenon, which we call stochastic synchronization, through which neurons become almost perfectly correlated even if the sources of randomness are independent. Due to its ability to quantify how activity of individual neurons and the correlation among them depends upon external inputs, the formalism introduced here can serve as a basis for exploring analytically the computational capability of population codes expressed by recurrent neural networks.

View Article: PubMed Central - PubMed

Affiliation: NeuroMathComp Laboratory, INRIA Sophia Antipolis Méditerranée, 2004 Route des Lucioles, BP 93, 06902 Valbonne, France ; Neural Computation Laboratory, Center for Neuroscience and Cognitive Systems @Unitn, Istituto Italiano di Tecnologia, 38068 Rovereto, Italy.

ABSTRACT
We introduce a new formalism for evaluating analytically the cross-correlation structure of a finite-size firing-rate network with recurrent connections. The analysis performs a first-order perturbative expansion of the neural activity equations, which include three different sources of randomness: the background noise of the membrane potentials, their initial conditions, and the distribution of the recurrent synaptic weights. This allows the analytical quantification of the relationship between anatomical and functional connectivity, i.e., of how the synaptic connections determine the statistical dependencies, at any order, among different neurons. The technique we develop is general, but for simplicity and clarity we demonstrate its efficacy by applying it to the case of synaptic connections described by regular graphs. The analytical equations so obtained reveal previously unknown behaviors of recurrent firing-rate networks, especially concerning how correlations are modified by the external input, by the finite size of the network, by the density of the anatomical connections, and by correlations among the sources of randomness. In particular, we show that a strong input can make the neurons almost independent, suggesting that functional connectivity does not depend only on the static anatomical connectivity but also on the external inputs. Moreover, we prove that in general it is not possible to find a mean-field description à la Sznitman of the network if the anatomical connections are too sparse or if our three sources of variability are correlated. To conclude, we show a very counterintuitive phenomenon, which we call stochastic synchronization, through which neurons become almost perfectly correlated even if the sources of randomness are independent.
Due to its ability to quantify how the activity of individual neurons and the correlations among them depend upon the external inputs, the formalism introduced here can serve as a basis for exploring analytically the computational capabilities of population codes expressed by recurrent neural networks.

No MeSH data available.



Examples of the block-circulant graphs for different values of F and ξ, with G fixed. The figure on the top represents the case , obtained for , , and . The figure at the bottom shows some examples of the special case  (circulant graph) for , namely  (cyclic graph), , and finally  (complete graph)
© Copyright Policy - OpenAccess




Mentions: Now we show an explicit example of this technique, namely the case when the blocks of the matrix T have the following symmetric circulant band structure:

$$\mathfrak{B}^{(i)}=\begin{bmatrix}
1-\delta_{i0} & 1 & \cdots & 1 & 0 & \cdots & 0 & 1 & \cdots & 1 \\
1 & 1-\delta_{i0} & \ddots & & \ddots & \ddots & & \ddots & \ddots & \vdots \\
\vdots & \ddots & \ddots & \ddots & & \ddots & \ddots & & \ddots & 1 \\
1 & & \ddots & \ddots & \ddots & & \ddots & \ddots & & 0 \\
0 & \ddots & & \ddots & \ddots & \ddots & & \ddots & \ddots & \vdots \\
\vdots & & \ddots & & \ddots & \ddots & \ddots & & \ddots & 0 \\
0 & & & \ddots & & \ddots & \ddots & \ddots & & 1 \\
1 & \ddots & & & \ddots & & \ddots & \ddots & \ddots & \vdots \\
\vdots & \ddots & \ddots & & & \ddots & & \ddots & 1-\delta_{i0} & 1 \\
1 & \cdots & 1 & 0 & \cdots & 0 & 1 & \cdots & 1 & 1-\delta_{i0}
\end{bmatrix}\tag{6.9}$$

where the first row of $\mathfrak{B}^{(i)}$ (excluding the diagonal term $1-\delta_{i0}$, which is 0 for $i=0$ and 1 for $i\neq0$) can be written explicitly as

$$\mathfrak{b}_{j}^{(i)}=\begin{cases}
1, & (1\leq j\leq\xi_{i})\vee(\varrho_{i}\leq j\leq G-1),\\
0, & \xi_{i}<j<\varrho_{i},
\end{cases}\qquad
\varrho_{i}=G-\xi_{i}+H\biggl(\xi_{i}-\biggl\lfloor\frac{G}{2}\biggr\rfloor+(-1)^{G}\biggr),\qquad
H(x)=\begin{cases}
0, & x\leq0,\\
1, & x>0,
\end{cases}$$

where H is the Heaviside step function. Here we have to suppose that the band is narrow enough, because otherwise it is not possible to distinguish the diagonal band from the corner elements. Now, the bandwidth of $\mathfrak{B}^{(i)}$ is determined by $\xi_{i}$, so this defines the integer parameters $\xi_{0},\ldots,\xi_{F-1}$. Moreover, $\xi_{0}$ determines the number of connections that every neuron in a given population receives from the neurons in the same population. Instead $\xi_{i}$, for $i\neq0$, determines the number of connections that every neuron in the kth population receives from the neurons in the $(k+i)\bmod F$ population. So the total number of incoming connections per neuron is the same throughout the network. The graph with this special block-circulant adjacency matrix will be represented by a dedicated notation, and some examples are shown in Fig. 2 for different values of F and ξ. This can be considered as a toy model for describing a network of F cortical columns containing G neurons each. The parameters can be adjusted in order to generate local and long-range connections compatible with recent neuroanatomical studies [45], providing a rough description of a wide area of neural tissue. This idea will be extended to the case of irregular graphs in Sect. 6.3.2. Moreover, it is important to observe that even if in this case all the matrices $\mathfrak{B}^{(i)}$ are symmetric, the matrix T is not, since the number of connections in every block is different (the case of symmetric connectivity matrices is studied in Sect. 6.2).
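As a concrete illustration, the first-row formula above can be turned directly into a construction of one block of the adjacency matrix. This is a minimal sketch, not code from the paper: the function name and the NumPy-based layout are my own, and only the band rule for $\mathfrak{b}_{j}^{(i)}$, the definition of $\varrho_{i}$, and the Heaviside function are taken from the equations.

```python
import numpy as np

def circulant_block(G, xi, i):
    """Build the G x G symmetric circulant band block B^(i) of Eq. (6.9).

    First-row entries b_j are 1 inside the band (1 <= j <= xi or
    rho <= j <= G-1) and 0 in the gap, where
    rho = G - xi + H(xi - floor(G/2) + (-1)**G)
    and H is the Heaviside step function. The diagonal entry is
    1 - delta_{i0}: 0 for the within-population block (i = 0), 1 otherwise.
    """
    H = lambda x: 1 if x > 0 else 0
    rho = G - xi + H(xi - G // 2 + (-1) ** G)

    # First row of the circulant block.
    row = np.zeros(G, dtype=int)
    row[0] = 0 if i == 0 else 1  # 1 - delta_{i0}
    for j in range(1, G):
        row[j] = 1 if (1 <= j <= xi) or (rho <= j <= G - 1) else 0

    # A circulant matrix is fully determined by its first row:
    # entry (r, c) equals row[(c - r) mod G].
    return np.array([[row[(c - r) % G] for c in range(G)] for r in range(G)])
```

For example, `circulant_block(6, 2, 0)` yields a symmetric matrix with a zero diagonal in which every neuron receives the same number of within-population connections, matching the regular-graph structure the formalism assumes.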

